Principal Component Flows

Authors: Edmond Cunningham, Adam D. Cobb, Susmit Jha

ICML 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In our experiments we show that PCFs and iPCFs are able to learn the principal manifolds over a variety of datasets. Additionally, we show that PCFs can perform density estimation on data that lie on a manifold with variable dimensionality, which is not possible with existing normalizing flows.
Researcher Affiliation | Collaboration | University of Massachusetts and SRI International.
Pseudocode | Yes | Appendix A provides a Python implementation.
Open Source Code | Yes | We provide Python code in Appendix A.
Open Datasets | Yes | We trained an iPCF and a standard injective normalizing flow (iNF) on the MNIST dataset (LeCun et al., 1998).
Dataset Splits | No | The paper specifies training and test splits (e.g., 700,000 samples for training and 300,000 for testing on the synthetic datasets) but does not explicitly mention a separate validation split or how one was handled; a split sketch follows the table.
Hardware Specification | Yes | Each model was trained for approximately 4 hours on either an NVIDIA 1080 Ti or 2080 Ti. ... We trained for around 4 hours on an NVIDIA 3090 GPU ... These models were trained for approximately 36 hours on either an NVIDIA 3090 Ti or RTX 8000 GPU.
Software Dependencies | No | Our experiments were written using the JAX (Bradbury et al., 2018) Python library. The paper names JAX but does not provide version numbers for it or for any other libraries used.
Experiment Setup | Yes | The models were trained using the AdaBelief (Zhuang et al., 2020) optimization algorithm with a learning rate of 1 × 10^-3, a batch size of 256, and α = 10.0; a minimal training-step sketch follows the table.
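
The dataset-splits row reports a 700,000 / 300,000 train/test partition for the synthetic datasets but no validation split. The sketch below is only an assumption about how such a split could be reproduced: the shuffling, the random seed, and the toy 3-dimensional stand-in data are illustrative, not taken from the paper.

```python
import numpy as np

def train_test_split(data, n_train=700_000, n_test=300_000, seed=0):
    """Shuffled split matching the reported 700k/300k partition.
    The shuffle and seed are assumptions; the paper does not state
    how the synthetic samples were assigned to each split."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(data))
    return data[perm[:n_train]], data[perm[n_train:n_train + n_test]]

# Illustrative stand-in for one of the synthetic datasets.
synthetic = np.random.standard_normal((1_000_000, 3)).astype(np.float32)
train, test = train_test_split(synthetic)
print(train.shape, test.shape)  # (700000, 3) (300000, 3)
```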
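
For the experiment-setup row, the reported settings (AdaBelief, learning rate 1 × 10^-3, batch size 256) can be expressed with optax's AdaBelief implementation; this is an assumption, since the paper only names JAX. The sketch below shows a minimal training step under those settings; the parameters, the quadratic stand-in loss, and the placeholder batch are hypothetical and do not reproduce the paper's PCF objective or its α = 10.0 term.

```python
import jax
import jax.numpy as jnp
import optax

learning_rate = 1e-3   # reported learning rate
batch_size = 256       # reported batch size
optimizer = optax.adabelief(learning_rate)

# Hypothetical parameters; the actual PCF architecture is not reproduced here.
params = {"w": jnp.zeros((3, 3)), "b": jnp.zeros(3)}
opt_state = optimizer.init(params)

def loss_fn(params, batch):
    # Stand-in quadratic loss; the paper's objective (with its alpha = 10.0
    # hyperparameter) would replace this.
    pred = batch @ params["w"] + params["b"]
    return jnp.mean(jnp.sum((pred - batch) ** 2, axis=-1))

@jax.jit
def train_step(params, opt_state, batch):
    loss, grads = jax.value_and_grad(loss_fn)(params, batch)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, loss

batch = jnp.ones((batch_size, 3))  # placeholder data
params, opt_state, loss = train_step(params, opt_state, batch)
```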