On the Generative Utility of Cyclic Conditionals
Authors: Chang Liu, Haoyue Tang, Tao Qin, Jintao Wang, Tie-Yan Liu
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | With the prior constraint removed, CyGen fits the data better and captures more representative features, as supported by both synthetic and real-world experiments. |
| Researcher Affiliation | Collaboration | (1) Microsoft Research Asia, Beijing, 100080; (2) Tsinghua University, Beijing, 100084. |
| Pseudocode | No | The paper describes methods using mathematical formulas but does not provide pseudocode or algorithm blocks. |
| Open Source Code | Yes | Codes: https://github.com/changliu00/cygen. |
| Open Datasets | Yes | We test the performance of CyGen on real-world image datasets MNIST and SVHN. |
| Dataset Splits | No | The paper mentions using MNIST and SVHN datasets but does not explicitly provide details about training, validation, or test splits. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments. |
| Software Dependencies | No | The paper mentions software like PyTorch in its references but does not explicitly list the software dependencies with specific version numbers used for its implementation. |
| Experiment Setup | Yes | All models are trained by the Adam optimizer [43] with a learning rate of 10^-3. ... For MNIST, the dimension of the latent space d_Z = 10; for SVHN, d_Z = 32. For both datasets, the number of Sylvester flows is 8, which means 8 Householder transformations. ... We train CyGen for 30,000 iterations for MNIST, and 100,000 for SVHN. Batch size is 128 for both datasets. |
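The reported hyperparameters can be collected into a minimal PyTorch training setup. This is a sketch only: the encoder/decoder modules below are placeholders, not the paper's CyGen architecture with Sylvester flows (the authors' actual implementation is at https://github.com/changliu00/cygen).

```python
import torch
from torch import nn

# Hyperparameters as reported in the paper's experiment setup.
# MNIST values shown; for SVHN the paper uses d_z = 32 and 100,000 iterations.
d_z = 10            # latent-space dimension for MNIST
batch_size = 128    # same for both datasets
num_iters = 30_000  # MNIST training iterations
lr = 1e-3           # Adam learning rate

# Placeholder networks; the paper's models (with 8 Sylvester flows,
# i.e. 8 Householder transformations) are not reproduced here.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, d_z))
decoder = nn.Sequential(nn.Linear(d_z, 28 * 28))

# All models are trained with Adam at learning rate 1e-3.
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=lr
)
```

A single optimizer over both networks mirrors the joint training of the conditionals; the real repository should be consulted for the loss and flow details.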