Learning Energy Decompositions for Partial Inference in GFlowNets
Authors: Hyosoon Jang, Minsu Kim, Sungsoo Ahn
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically verify the superiority of LED-GFN in five problems including the generation of unstructured and maximum independent sets, molecular graphs, and RNA sequences. |
| Researcher Affiliation | Academia | Hyosoon Jang1, Minsu Kim2, Sungsoo Ahn1 1POSTECH 2KAIST |
| Pseudocode | Yes | Algorithm 1: Learning energy decomposition for GFlowNets |
| Open Source Code | Yes | In the supplementary materials, we also include the codes for molecule generation tasks based on the official implementation codes of the prior study (Pan et al., 2023a). |
| Open Datasets | Yes | We extensively validate LED-GFN on various tasks: set generation (Pan et al., 2023a), bag generation (Shen et al., 2023), molecular discovery (Bengio et al., 2021b), RNA sequence generation (Jain et al., 2022), and the maximum independent set problem (Zhang et al., 2023). |
| Dataset Splits | No | The paper references prior studies for experimental settings, but does not explicitly provide specific training/test/validation dataset splits (percentages or counts) within its text. |
| Hardware Specification | Yes | We use a single GPU of NVIDIA A5000 for this experiment. |
| Software Dependencies | No | The paper mentions the use of certain models and frameworks (e.g., Message Passing Neural Networks, PPO) but does not provide specific version numbers for software dependencies or libraries. |
| Experiment Setup | Yes | In each round, we generate B1 = 32 bags from the policy. The GFlowNet model consists of two hidden layers with 16 hidden dimensions, which is trained with a learning rate of 1e-4. We use an exploration epsilon of 0.01. [...] We set the number of iterations in energy decomposition learning N = 8 for each round. |
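
The Experiment Setup row fixes concrete hyperparameters for the bag-generation task: 32 bags sampled per round, a GFlowNet policy with two hidden layers of 16 units, a learning rate of 1e-4, an exploration epsilon of 0.01, and N = 8 energy-decomposition iterations per round. The PyTorch sketch below shows one way those values could be wired together; the class and function names (`PolicyMLP`, `sample_action`) are illustrative assumptions, not the authors' released implementation, and the closing comment only paraphrases the LED training loop at a high level.

```python
# Hypothetical sketch built from the hyperparameters quoted above;
# not taken from the official LED-GFN code.
import torch
import torch.nn as nn

BATCH_SIZE = 32         # B1: bags sampled from the policy per round
HIDDEN_DIM = 16         # two hidden layers with 16 hidden dimensions
LEARNING_RATE = 1e-4    # policy learning rate
EXPLORATION_EPS = 0.01  # exploration epsilon
LED_ITERS = 8           # N: energy-decomposition learning iterations per round


class PolicyMLP(nn.Module):
    """Two-hidden-layer MLP matching the reported GFlowNet policy size."""

    def __init__(self, state_dim: int, num_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ReLU(),
            nn.Linear(HIDDEN_DIM, num_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # Unnormalized logits over forward actions.
        return self.net(state)


def sample_action(logits: torch.Tensor, eps: float = EXPLORATION_EPS) -> int:
    """Sample from the policy, taking a uniform random action with prob. eps."""
    if torch.rand(()) < eps:
        return int(torch.randint(logits.shape[-1], ()))
    return int(torch.distributions.Categorical(logits=logits).sample())


# Per round (sketch): sample BATCH_SIZE trajectories with the policy,
# run LED_ITERS regression updates that fit per-transition local energies
# whose sum matches each trajectory's terminal energy, then update the
# policy with a partial-inference (subtrajectory) objective that uses
# those learned local energies as intermediate signals.
```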