Disentangled Representation Learning in Non-Markovian Causal Systems
Authors: Adam Li, Yushu Pan, Elias Bareinboim
NeurIPS 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The theory is corroborated by experiments. |
| Researcher Affiliation | Academia | Adam Li, Yushu Pan, and Elias Bareinboim; Causal Artificial Intelligence Lab, Columbia University; {adam.li, yushupan, eb}@cs.columbia.edu |
| Pseudocode | Yes | Algorithm 1 CRID: Algorithm for determining causal representation identifiability |
| Open Source Code | Yes | The code we used to run experiments is here: https://github.com/tree1111/CDRL. |
| Open Datasets | No | The paper uses synthetic data generated according to latent causal diagrams and mentions 'Color MNIST with bar data generation' but does not provide concrete access information (link, DOI, formal citation) for a publicly available dataset. |
| Dataset Splits | No | The paper mentions generating 200,000 data points but does not specify explicit training, validation, or test split percentages or counts for reproducibility. |
| Hardware Specification | Yes | We use NVIDIA H100 GPUs to train the neural network models. |
| Software Dependencies | No | The paper mentions using 'ADAM optimizer [84]' and 'Neural Spline Flows [83]' but does not provide specific version numbers for these or other software libraries. |
| Experiment Setup | Yes | We use the ADAM optimizer [84]. We start with a learning rate of 1e-4. We train the model for 200 epochs with a batch size of 4096. |
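The Experiment Setup row reports only the optimizer, learning rate, epoch count, and batch size. The sketch below is a minimal, hypothetical illustration of how those reported hyperparameters would be wired together in PyTorch; the model, data, and loss are placeholders (the paper's actual models are flow-based, using Neural Spline Flows), so this is not the authors' implementation.

```python
# Minimal sketch of the reported training configuration.
# Only the optimizer (ADAM), learning rate (1e-4), epoch count (200), and
# batch size (4096) come from the paper; everything else is a placeholder.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Placeholder synthetic data standing in for the paper's generated samples.
X = torch.randn(200_000, 16)
loader = DataLoader(TensorDataset(X), batch_size=4096, shuffle=True)

# Placeholder network; the paper trains flow-based models instead.
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 16))

# Reported settings: ADAM optimizer with learning rate 1e-4.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Reported training length: 200 epochs.
for epoch in range(200):
    for (batch,) in loader:
        optimizer.zero_grad()
        loss = nn.functional.mse_loss(model(batch), batch)  # stand-in objective
        loss.backward()
        optimizer.step()
```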