Contrastive Learning Inverts the Data Generating Process
Authors: Roland S. Zimmermann, Yash Sharma, Steffen Schneider, Matthias Bethge, Wieland Brendel
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically verify our predictions when the assumed theoretical conditions are fulfilled. In addition, we show successful inversion of the data generating process even if these theoretical assumptions are partially violated. Tables 1 and 2 show results evaluating identifiability up to affine transformations and generalized permutations, respectively. |
| Researcher Affiliation | Academia | ¹University of Tübingen, Tübingen, Germany; ²IMPRS for Intelligent Systems, Tübingen, Germany; ³EPFL, Geneva, Switzerland. |
| Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper. |
| Open Source Code | Yes | Online version and code: brendel-group.github.io/cl-ica/ |
| Open Datasets | Yes | 3DIdent is available at zenodo.org/record/4502485. |
| Dataset Splits | No | The paper mentions a test set for 3DIdent but does not explicitly specify training/validation/test splits (e.g., percentages or sample counts per split). |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions software like Blender and FAISS by name and citation, but does not provide specific version numbers for these or other software dependencies used in the experiments. |
| Experiment Setup | Yes | In our experiments, we use the same training hyperparameters and (encoder) architecture as Klindt et al. (2021); for further details, see Appx. A.3. |
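The paper's theory concerns contrastive objectives of the InfoNCE family (as popularized by SimCLR and related methods). As a point of reference for readers of this report, a minimal NumPy sketch of such a loss is given below; this is an illustration under common conventions, not the authors' implementation, and the function name and temperature parameter `tau` are our own.

```python
import numpy as np

def info_nce_loss(z, z_pos, tau=0.1):
    """Minimal InfoNCE contrastive loss (illustrative sketch).

    z, z_pos: (batch, dim) arrays of encoder outputs for anchors and
    their positive views; the other batch entries act as negatives.
    tau: temperature scaling the similarities.
    """
    # Normalize embeddings to the unit hypersphere, as is common
    # for cosine-similarity-based contrastive objectives.
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    z_pos = z_pos / np.linalg.norm(z_pos, axis=1, keepdims=True)

    # Pairwise cosine similarities; the diagonal holds positive pairs.
    logits = (z @ z_pos.T) / tau  # shape: (batch, batch)

    # Cross-entropy with the matching index as the target class.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

As a sanity check, the loss should be low when each anchor's positive is a near-copy of itself and higher when positives are unrelated random vectors.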