Sparse Invariant Risk Minimization
Authors: Xiao Zhou, Yong Lin, Weizhong Zhang, Tong Zhang
ICML 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically we demonstrate the power of Sparse IRM through various datasets and models and surpass state-of-the-art methods with a gap up to 29%. |
| Researcher Affiliation | Collaboration | ¹The Hong Kong University of Science and Technology, ²Google Research. |
| Pseudocode | Yes | Algorithm 1 Sparse Invariant Risk Minimization (Sparse IRM). (A hedged training sketch follows the table.) |
| Open Source Code | No | The paper does not contain an explicit statement about releasing source code for the method, nor does it provide a link to a repository. |
| Open Datasets | Yes | We conduct a series of experiments on benchmarks which are widely-used in latest studies (Arjovsky et al., 2019; Ahmed et al., 2020) to justify the superiority of our Sparse IRM. [...] datasets Colored MNIST (Arjovsky et al., 2019) and Full Colored MNIST (Ahmed et al., 2020) [...] CIFARMNIST and Colored Object datasets. |
| Dataset Splits | No | The paper discusses "training environments" and "testing environments" but does not explicitly mention the use of a separate "validation" dataset split or specific percentages/counts for such a split. |
| Hardware Specification | No | The paper does not provide any specific hardware details such as CPU, GPU models, or cloud resources used for running the experiments. |
| Software Dependencies | No | The paper mentions "TensorFlow and PyTorch" as general deep neural network training platforms but does not specify their version numbers or any other software dependencies with versions. |
| Experiment Setup | Yes | D. Experimental Configurations (columns: C/FCM, CM, CO): Epochs 1500 / 50 / 75; Weight Optimizer Adam / SGD / SGD; Weight Learning Rate 0.0004 / 0.01 / 0.01; Weight Momentum n/a / 0.9 / 0.9; Probability Optimizer Adam for all; Probability Learning Rate 6e-3 for all; Penalty Weight 10000 for all; Learning Rate Scheduler Cosine for all. (Reconstructed as a runnable configuration below the table.) |
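
The Appendix D settings quoted in the last row are easier to scan as a structured configuration. Below is a minimal reconstruction as a Python dict: the key names (`weight_lr`, `prob_optimizer`, ...) are our own labels for the paper's rows, and the momentum entry for the C/FCM column is set to `None` because the flattened text lists only two momentum values, which we read as belonging to the two SGD columns.

```python
# Reconstruction of Appendix D, "Experimental Configurations".
# Column abbreviations follow the paper: C/FCM = Colored / Full Colored MNIST,
# CM = CIFARMNIST, CO = Colored Object. Key names are ours.
CONFIGS = {
    "C/FCM": {
        "epochs": 1500,
        "weight_optimizer": "Adam",
        "weight_lr": 4e-4,
        "weight_momentum": None,  # no momentum value listed for the Adam column
        "prob_optimizer": "Adam",
        "prob_lr": 6e-3,
        "penalty_weight": 1e4,
        "lr_scheduler": "cosine",
    },
    "CM": {
        "epochs": 50,
        "weight_optimizer": "SGD",
        "weight_lr": 1e-2,
        "weight_momentum": 0.9,
        "prob_optimizer": "Adam",
        "prob_lr": 6e-3,
        "penalty_weight": 1e4,
        "lr_scheduler": "cosine",
    },
    "CO": {
        "epochs": 75,
        "weight_optimizer": "SGD",
        "weight_lr": 1e-2,
        "weight_momentum": 0.9,
        "prob_optimizer": "Adam",
        "prob_lr": 6e-3,
        "penalty_weight": 1e4,
        "lr_scheduler": "cosine",
    },
}
```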
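
The pseudocode row cites Algorithm 1, but only its caption is quoted here. To make the structure concrete, here is a minimal, self-contained sketch of what such a method can look like under stated assumptions: an IRMv1-style penalty (Arjovsky et al., 2019) summed over training environments, network weights gated by learnable per-weight keep-probabilities, and two optimizers (one for weights, one for probabilities), as the Appendix D rows suggest. The mask parameterization (a sigmoid gate with a straight-through threshold), the expected-density penalty standing in for the paper's sparsity control, the toy data, and all identifiers are ours, not the paper's; hyperparameter values follow the C/FCM column, with the cosine scheduler omitted for brevity.

```python
import torch
import torch.nn.functional as F


def irm_penalty(logits, y):
    """IRMv1 penalty (Arjovsky et al., 2019): squared gradient of the
    environment risk with respect to a fixed dummy classifier scale of 1.0."""
    scale = torch.ones(1, device=logits.device, requires_grad=True)
    risk = F.binary_cross_entropy_with_logits(logits * scale, y)
    (grad,) = torch.autograd.grad(risk, scale, create_graph=True)
    return (grad ** 2).sum()


class MaskedLinear(torch.nn.Module):
    """Linear layer whose weights are gated by learnable keep-probabilities
    (our stand-in for the paper's probabilistic sparse parameterization)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        self.weight = torch.nn.Parameter(0.01 * torch.randn(out_features, in_features))
        self.bias = torch.nn.Parameter(torch.zeros(out_features))
        # Logits of per-weight keep-probabilities; start dense (sigmoid(1) ~ 0.73).
        self.prob_logits = torch.nn.Parameter(torch.full((out_features, in_features), 1.0))

    def forward(self, x):
        probs = torch.sigmoid(self.prob_logits)
        hard = (probs > 0.5).float()          # binary mask in the forward pass
        mask = hard + probs - probs.detach()  # straight-through gradient to probs
        return F.linear(x, self.weight * mask, self.bias)


# Two toy training "environments" (random stand-ins for, e.g., Colored MNIST batches).
torch.manual_seed(0)
environments = [
    (torch.randn(64, 392), torch.randint(0, 2, (64,)).float()),
    (torch.randn(64, 392), torch.randint(0, 2, (64,)).float()),
]

model = MaskedLinear(392, 1)
# Separate optimizers for weights and probabilities, mirroring the two optimizer
# rows in Appendix D (values from the C/FCM column).
opt_w = torch.optim.Adam([model.weight, model.bias], lr=4e-4)
opt_p = torch.optim.Adam([model.prob_logits], lr=6e-3)
penalty_weight = 1e4    # "Penalty Weight" row in Appendix D
density_weight = 1e-1   # ours: expected-density penalty to induce sparsity

for step in range(100):
    total = torch.zeros(())
    for x_e, y_e in environments:
        logits = model(x_e).squeeze(-1)
        risk = F.binary_cross_entropy_with_logits(logits, y_e)
        total = total + risk + penalty_weight * irm_penalty(logits, y_e)
    # Push keep-probabilities down; the paper instead controls sparsity directly.
    total = total + density_weight * torch.sigmoid(model.prob_logits).mean()
    opt_w.zero_grad()
    opt_p.zero_grad()
    total.backward()
    opt_w.step()
    opt_p.step()
```

The two-optimizer split matters: the probabilities typically need a larger learning rate (6e-3 here) than the weights so that the mask can adapt before training converges; this mirrors the separate "Weight" and "Probability" rows in the quoted configuration.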