Compositional Sculpting of Iterative Generative Processes
Authors: Timur Garipov, Sebastiaan De Peuter, Ge Yang, Vikas Garg, Samuel Kaski, Tommi Jaakkola
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We offer empirical results on image and molecular generation tasks. and Section 6, Experiments. |
| Researcher Affiliation | Collaboration | Timur Garipov (1), Sebastiaan De Peuter (2), Ge Yang (1,4), Vikas Garg (2,5), Samuel Kaski (2,3), Tommi Jaakkola (1); 1: MIT CSAIL, 2: Aalto University, 3: University of Manchester, 4: Institute for Artificial Intelligence and Fundamental Interactions, 5: YaiYai Ltd |
| Pseudocode | Yes | Algorithm A.1 Compositional Sculpting: classifier training (an illustrative sketch follows this table) |
| Open Source Code | Yes | Project codebase: https://github.com/timgaripov/compositional-sculpting. |
| Open Datasets | Yes | three diffusion models trained to generate MNIST [56] digits {0, 1, 2, 3} in two colors: cyan and beige. and [56] Yann LeCun. The MNIST database of handwritten digits. http://yann.lecun.com/exdb/mnist/, 1998. |
| Dataset Splits | No | The paper mentions training steps and batch sizes but does not specify explicit train/validation/test dataset splits with percentages or counts. |
| Hardware Specification | Yes | All models were trained with a single GeForce RTX 2080 Ti GPU. and All models were trained with a single Tesla V100 GPU. |
| Software Dependencies | No | The paper mentions using PyTorch [68] and optimizers like Adam [59] and AdaDelta [67] but does not provide specific version numbers for these software components (e.g., 'PyTorch 1.x' or 'Python 3.x'). |
| Experiment Setup | Yes | We used Adam optimizer [59] with learning rate 0.001, and pre-train the base models for 20,000 steps with batch size 16 (16 trajectories per batch). and The score model was trained using Adam optimizer [59] with a learning rate decreasing exponentially from 10⁻² to 10⁻⁴. We performed 200 training steps with batch size 32. (a schedule sketch follows this table) |
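
The pseudocode evidence points to Algorithm A.1 (classifier training). The paper's algorithm operates on trajectories from its iterative generative processes and is not reproduced here; as a rough illustration of the core idea only, the sketch below trains a classifier to predict which of two base models produced a given sample. The toy samplers `sample_model_1`/`sample_model_2`, the network architecture, and all hyperparameters are placeholder assumptions, not the paper's setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-ins for two pre-trained base generative models;
# the paper trains on trajectories from its actual generative processes.
def sample_model_1(n):
    return torch.randn(n, 2) + 2.0

def sample_model_2(n):
    return torch.randn(n, 2) - 2.0

# Placeholder classifier: logits over "which base model produced this sample".
classifier = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)

for step in range(1000):
    x = torch.cat([sample_model_1(16), sample_model_2(16)])
    # Label each sample with the index of the model that generated it.
    y = torch.cat([torch.zeros(16, dtype=torch.long),
                   torch.ones(16, dtype=torch.long)])
    loss = F.cross_entropy(classifier(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```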
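
For the score-model schedule quoted in the experiment-setup row (Adam, learning rate decaying exponentially from 10⁻² to 10⁻⁴ over 200 steps, batch size 32), the following PyTorch sketch shows one way to realize that decay with `torch.optim.lr_scheduler.ExponentialLR`. The model and loss are placeholders, and the per-step decay factor is derived from the quoted endpoints rather than taken from the paper.

```python
import torch
import torch.nn as nn

# Placeholder stand-in for the paper's score model.
score_model = nn.Sequential(nn.Linear(2, 128), nn.ReLU(), nn.Linear(128, 2))

num_steps = 200
lr_start, lr_end = 1e-2, 1e-4

optimizer = torch.optim.Adam(score_model.parameters(), lr=lr_start)
# Per-step multiplicative factor so the learning rate decays from
# lr_start to lr_end over exactly num_steps steps.
scheduler = torch.optim.lr_scheduler.ExponentialLR(
    optimizer, gamma=(lr_end / lr_start) ** (1.0 / num_steps)
)

for step in range(num_steps):
    batch = torch.randn(32, 2)              # placeholder batch, size 32 as quoted
    loss = score_model(batch).pow(2).mean()  # placeholder loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    scheduler.step()  # apply the exponential decay once per training step
```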