Learning Causal Semantic Representation for Out-of-Distribution Prediction
Authors: Chang Liu, Xinwei Sun, Jindong Wang, Haoyue Tang, Tao Li, Tao Qin, Wei Chen, Tie-Yan Liu
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical study showing improved OOD performance over prevailing baselines: the paper develops methods for OOD generalization and domain adaptation and reports mostly better performance than prevailing methods on real-world image classification tasks. |
| Researcher Affiliation | Collaboration | Microsoft Research Asia, Beijing, 100080; Tsinghua University, Beijing, 100084; Peking University, Beijing, 100871. |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. Methods are described textually and mathematically. |
| Open Source Code | Yes | Codes are available at https://github.com/changliu00/causal-semantic-generative-model. |
| Open Datasets | Yes | Shifted-MNIST; ImageCLEF-DA, a standard benchmark for domain adaptation [1]; PACS, a more recent benchmark dataset [69]; and VLCS [30]. |
| Dataset Splits | Yes | For our methods, we use a validation set from the training domain for model selection. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. It only discusses model architectures and datasets. |
| Software Dependencies | No | The paper mentions PyTorch [84] but does not specify a version for it or for any other software dependency, which would be necessary for reproducibility. |
| Experiment Setup | Yes | In practice x often has a much larger dimension than y, so the first (supervision) term in Eqs. (2, 3, 5) is overwhelmed by the second (unsupervised) term; the paper therefore downscales the second term. A minimal sketch of this loss weighting appears below the table. |
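
To make the reported setup concrete, here is a minimal sketch of the loss weighting described in the Experiment Setup row, assuming a supervised classification term on the low-dimensional label y and an unsupervised reconstruction term on the high-dimensional input x. The function name `weighted_csg_loss`, the MSE stand-in for the paper's ELBO-style term, and the value of `lambda_unsup` are illustrative assumptions, not taken from the paper or its repository.

```python
import torch
import torch.nn.functional as F

def weighted_csg_loss(logits, y, x_recon, x, lambda_unsup=1e-3):
    # Supervised term: classification loss on the (low-dimensional) label y.
    sup_term = F.cross_entropy(logits, y)
    # Unsupervised term: reconstruction of the (high-dimensional) input x,
    # used here as a stand-in for the ELBO term in Eqs. (2, 3, 5).
    unsup_term = F.mse_loss(x_recon, x)
    # Downscale the unsupervised term so it does not overwhelm supervision;
    # lambda_unsup is a hypothetical hyperparameter, not the paper's value.
    return sup_term + lambda_unsup * unsup_term

# Toy usage: a batch of 8 images (3x32x32) with 10 classes.
logits = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
x = torch.randn(8, 3, 32, 32)
x_recon = torch.randn(8, 3, 32, 32)
print(weighted_csg_loss(logits, y, x_recon, x))
```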