HiABP: Hierarchical Initialized ABP for Unsupervised Representation Learning
Authors: Jiankai Sun, Rui Liu, Bolei Zhou (pp. 9747-9755)
AAAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results validate that our framework can outperform other popular deep generative models in modeling natural images and learning from incomplete data. We further demonstrate the unsupervised disentanglement of hierarchical latent representation with controllable image synthesis. |
| Researcher Affiliation | Academia | Jiankai Sun (1,2), Rui Liu (2), Bolei Zhou (1,2); 1 Centre for Perceptual and Interactive Intelligence; 2 The Chinese University of Hong Kong; {sj019, bzhou}@ie.cuhk.edu.hk, ruiliu@link.cuhk.edu.hk |
| Pseudocode | Yes | Algorithm 1 Hierarchical Initialized Alternating Back-propagation (HiABP); a minimal sketch of this alternating loop appears after the table. |
| Open Source Code | No | The paper does not provide an explicit statement or link for the open-source code of the described methodology. |
| Open Datasets | Yes | Three datasets are employed: MNIST, Street View House Numbers (SVHN) (Netzer et al. 2011), and CelebA (Align & Cropped version), with the respective training and test partitions. |
| Dataset Splits | No | The paper mentions 'training and test partitions' but does not explicitly state a validation split or specific percentages/counts for any split. |
| Hardware Specification | Yes | Experiments are run on NVIDIA GTX Titan X. |
| Software Dependencies | Yes | We code all models in Python 3.6, SciPy 1.0.0, and TensorFlow 1.15. |
| Experiment Setup | Yes | The learning rate ηφ = ηθ = 0.0003. If not stated otherwise, we use Langevin inference steps T = 15, σ = 0.3, step size s = 0.15, batch size m = 100. |
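The Pseudocode and Experiment Setup rows together pin down the core loop of Algorithm 1: Langevin inference of the latent code, followed by a gradient update of the generator parameters. Below is a minimal NumPy sketch of one such alternating step, not the authors' implementation. The linear generator `g(z) = Wz`, the dimensions `D_Z`/`D_X`, and the Gaussian draw for `z0` (standing in for HiABP's learned hierarchical initializer) are illustrative assumptions; only `T`, `SIGMA`, `STEP`, `BATCH`, and `LR` come from the paper's stated settings.

```python
import numpy as np

# Hyperparameters quoted from the "Experiment Setup" row above.
T, SIGMA, STEP, BATCH, LR = 15, 0.3, 0.15, 100, 0.0003

rng = np.random.default_rng(0)
D_Z, D_X = 8, 32                        # toy latent/data dims (assumptions)
W = rng.normal(0.0, 0.1, (D_X, D_Z))    # toy linear "generator": g(z) = W z

def generate(z):
    return z @ W.T

def langevin_infer(x, z):
    """Run T Langevin steps on log p(z | x) for the Gaussian model
    log p(z, x) = -||x - g(z)||^2 / (2 sigma^2) - ||z||^2 / 2 + const."""
    for _ in range(T):
        grad = (x - generate(z)) @ W / SIGMA**2 - z   # grad_z log p(z, x)
        z = z + 0.5 * STEP**2 * grad + STEP * rng.normal(size=z.shape)
    return z

# One alternating update: infer z, then one gradient-ascent step on theta (= W).
x = rng.normal(size=(BATCH, D_X))   # stand-in for a data batch
z0 = rng.normal(size=(BATCH, D_Z))  # HiABP would instead initialize z0 with its
                                    # learned hierarchical inference model
z = langevin_infer(x, z0)
grad_W = (x - generate(z)).T @ z / (BATCH * SIGMA**2)
W += LR * grad_W                    # ascend the log-likelihood in theta
```

In the paper's actual setting the generator is a deep network trained in TensorFlow 1.15, and, per the title, the distinguishing ingredient is the hierarchical initialization of the latent code before the Langevin chain, rather than the cold-start chains of vanilla ABP.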