On the Identifiability of Nonlinear ICA: Sparsity and Beyond

Authors: Yujia Zheng, Ignavier Ng, Kun Zhang

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental (5 experiments) | "To validate the proposed theory of the identifiability of nonlinear ICA with unconditional priors, we conduct experiments based on our main condition, i.e., structural sparsity (Thm. 1), as well as the condition of independent influences (Prop. 2)." (An illustrative evaluation sketch follows the table.)
Researcher Affiliation | Academia | ¹Carnegie Mellon University, ²Mohamed bin Zayed University of Artificial Intelligence; {yujiazh, ignavierng, kunz1}@cmu.edu
Pseudocode | No | The paper contains mathematical formulations and theorems but does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | No | The paper does not include any explicit statement about releasing source code or a link to a code repository for the described methodology.
Open Datasets | Yes | "To study how reasonable the proposed theories are w.r.t. the practical generating process of observational data, we conduct experiments on the 'Triangles' image dataset (Yang et al., 2022)."
Dataset Splits | No | The paper mentions training models and conducting an ablation study but does not explicitly provide training, validation, and test dataset splits (e.g., percentages or sample counts).
Hardware Specification | No | The paper does not provide any specific hardware details, such as GPU or CPU models or cloud computing instance types, used for conducting the experiments.
Software Dependencies | No | The paper refers to training a GIN model and to L1 and L0 regularization norms but does not list software dependencies with version numbers (e.g., Python or PyTorch versions).
Experiment Setup | No | The paper describes parts of the experimental setup, such as the objective function and its sparsity regularization terms (see the sketches following this table), but does not report hyperparameter values (e.g., learning rate, batch size, number of epochs) or optimizer settings.
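
Since the table above notes that no code is released, the following is only a minimal sketch of the regularization idea the paper describes: an L1 penalty on the Jacobian of a learned unmixing network, in the spirit of structural sparsity. The MLP here stands in for the paper's GIN flow model, and the architecture, base loss, optimizer settings, and penalty weight are all illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch, NOT the authors' implementation: an L1 penalty on the
# Jacobian of a learned unmixing network, encouraging a sparse mixing
# structure. All names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class Unmixing(nn.Module):
    """Hypothetical unmixing network mapping observations x to source estimates."""
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.LeakyReLU(),
            nn.Linear(hidden, hidden), nn.LeakyReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def l1_jacobian_penalty(model: nn.Module, x: torch.Tensor) -> torch.Tensor:
    """Mean absolute entry of the Jacobian ds/dx of the unmixing map."""
    jac = torch.autograd.functional.jacobian(
        lambda inp: model(inp).sum(dim=0), x, create_graph=True
    )  # shape: (dim_sources, batch, dim_observations)
    return jac.abs().mean()

dim, batch = 5, 128
model = Unmixing(dim)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)  # assumed optimizer settings

x = torch.randn(batch, dim)        # stand-in for observed mixtures
s_hat = model(x)
base = 0.5 * s_hat.pow(2).mean()   # stand-in Gaussian-prior term; a real flow
                                   # objective would also include a log-det term
loss = base + 0.01 * l1_jacobian_penalty(model, x)  # assumed weight 0.01
opt.zero_grad()
loss.backward()
opt.step()
```

In practice the penalty weight would be tuned, and a GIN-style invertible architecture would replace the plain MLP so that the likelihood term is well defined.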
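The experiments described in the first row validate identifiability of the recovered sources. The standard metric for this in the nonlinear ICA literature is the mean correlation coefficient (MCC); whether the paper's evaluation uses this exact variant (absolute Spearman correlations matched with the Hungarian algorithm) is an assumption, but the sketch below shows how such an evaluation is typically computed.

```python
# Minimal sketch of the mean correlation coefficient (MCC), a standard
# identifiability metric in the nonlinear ICA literature. This exact variant
# is an assumption, not a reconstruction of the paper's evaluation code.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.stats import spearmanr

def mcc(s_true: np.ndarray, s_est: np.ndarray) -> float:
    """MCC between true and estimated sources, both shaped (n_samples, dim)."""
    d = s_true.shape[1]
    corr = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            # Spearman correlation is invariant to monotone component-wise maps.
            corr[i, j] = abs(spearmanr(s_true[:, i], s_est[:, j])[0])
    row, col = linear_sum_assignment(-corr)  # maximize total matched correlation
    return float(corr[row, col].mean())

# A permuted, component-wise nonlinear transform of the sources should score
# close to 1, since identifiability holds only up to such transformations.
rng = np.random.default_rng(0)
s = rng.normal(size=(1000, 3))
s_est = np.tanh(s[:, [2, 0, 1]])   # permute columns + elementwise nonlinearity
print(round(mcc(s, s_est), 3))     # prints ~1.0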