MolGrow: A Graph Normalizing Flow for Hierarchical Molecular Generation
Authors: Maksim Kuznetsov, Daniil Polykovskiy (pp. 8226–8234)
AAAI 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Experiments: We consider three problems: distribution learning, global molecular property optimization and constrained optimization. For all the experiments, we provide model and optimization hyperparameters in supplementary material A; the source code for reproducing all the experiments is provided in supplementary materials." |
| Researcher Affiliation | Industry | Maksim Kuznetsov, Daniil Polykovskiy Insilico Medicine kuznetsov@insilico.com, daniil@insilico.com |
| Pseudocode | Yes | Algorithm 1 Attention on complete graph edges (CAGE) |
| Open Source Code | Yes | For all the experiments, we provide model and optimization hyperparameters in supplementary material A; the source code for reproducing all the experiments is provided in supplementary materials. |
| Open Datasets | Yes | We report the results on MOSES (Polykovskiy et al. 2020) dataset in Table 1. We provide the results on QM9 (Ramakrishnan et al. 2014) and ZINC250k (Kusner, Paige, and Hernández-Lobato 2017) datasets in supplementary material B. |
| Dataset Splits | No | The paper mentions 'train' and 'test' sets when evaluating models and training objectives, but does not specify how the datasets were split into training, validation, and test sets (e.g., percentages, sample counts, or a splitting methodology). It mentions 'maximizing training set log-likelihood' and 'train on a large dataset' but gives no concrete split information. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details, such as library names with version numbers, needed to replicate the experiment. |
| Experiment Setup | Yes | For all the experiments, we provide model and optimization hyperparameters in supplementary material A; the source code for reproducing all the experiments is provided in supplementary materials. |