Which Invariance Should We Transfer? A Causal Minimax Learning Approach

Authors: Mingzhou Liu, Xiangyu Zheng, Xinwei Sun, Fang Fang, Yizhou Wang

ICML 2023

Reproducibility

| Variable | Result | LLM Response |
|----------|--------|--------------|
| Research Type | Experimental | "The effectiveness and efficiency of our methods are demonstrated on synthetic data and the diagnosis of Alzheimer's disease." |
| Researcher Affiliation | Academia | 1. Sch. of Computer Science, Peking University; 2. Center on Frontiers of Computing Studies, Peking University; 3. Dep. of Statistics, Guanghua Sch. of Management, Peking University; 4. Sch. of Data Science, Fudan University; 5. Sch. of Psychological and Cognitive Sciences, Peking University; 6. Inst. for Artificial Intelligence, Peking University. |
| Pseudocode | Yes | Algorithm 1: Optimal subset S selection. Algorithm 2: Equivalence classes recovery. |
| Open Source Code | Yes | "Code is available at https://github.com/lmz123321/which_invariance." |
| Open Datasets | Yes | "We consider the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset (Petersen et al., 2010)... We consider the International Mouse Phenotyping Consortium (IMPC) dataset... (Magliacane et al., 2018)." |
| Dataset Splits | Yes | "Surgery estimator (Subbaswamy et al., 2019) that used validation loss to identify the optimal subset; ... For the estimation of L, the epochs are set to 12000 with an early stop." |
| Hardware Specification | Yes | "All codes are implemented with PyTorch 1.10 and run on an Intel Xeon E5-2699A v4 @ 2.40GHz CPU." |
| Software Dependencies | Yes | "All codes are implemented with PyTorch 1.10 and run on an Intel Xeon E5-2699A v4 @ 2.40GHz CPU." |
| Experiment Setup | Yes | "The learning rate is set to 0.02, and epochs are set to 10000 with an early stop... For the estimation of f_S, the epochs are set to 5000, the learning rate is set to 0.25 in the first 4000 epochs, and decreased to 0.1 in the last 1000 epochs. For the estimation of L, the epochs are set to 12000 with an early stop, the learning rate is set to 0.4." |
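The quoted setup describes a piecewise learning-rate schedule (0.25 for the first 4000 epochs, then 0.1 for the last 1000) combined with early stopping on a validation loss. A minimal sketch of that logic is below; the function names, the 0-indexed epoch convention, and the patience value are illustrative assumptions, not taken from the authors' released code.

```python
def lr_at_epoch(epoch, switch_epoch=4000, lr_high=0.25, lr_low=0.1):
    """Learning rate at a given (0-indexed) epoch: high before the
    switch point, low afterwards, mirroring the schedule for f_S."""
    return lr_high if epoch < switch_epoch else lr_low


def should_stop(val_losses, patience=100):
    """Simple early-stopping rule: stop once the validation loss has
    not improved for `patience` consecutive epochs. The patience value
    is an assumption; the paper only states that an early stop is used."""
    if len(val_losses) <= patience:
        return False
    best_recent = min(val_losses[-patience:])
    best_earlier = min(val_losses[:-patience])
    return best_recent >= best_earlier
```

With a PyTorch optimizer, the same schedule could be wired in via `torch.optim.lr_scheduler.LambdaLR`, but the plain functions above keep the sketch dependency-free.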