Domain-wise Data Acquisition to Improve Performance under Distribution Shift
Authors: Yue He, Dongbai Li, Pengfei Tian, Han Yu, Jiashuo Liu, Hao Zou, Peng Cui
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive real-world experiments demonstrate our proposal's advantages in machine learning applications. |
| Researcher Affiliation | Academia | 1Department of Computer Science and Technology, Tsinghua University, Beijing, China 2Qiuzhen College, Tsinghua University, Beijing, China 3Zhongguancun Laboratory, Beijing, China. |
| Pseudocode | Yes | Algorithm 1 Domain-wise Active Acquisition (DAA) |
| Open Source Code | Yes | The code is available at https://github.com/dongbaili/DAA. |
| Open Datasets | Yes | We conduct experimental analyses on the Rotated MNIST dataset (Ghifary et al., 2015). This dataset builds upon the original MNIST dataset (LeCun et al., 1998). |
| Dataset Splits | No | The paper explicitly defines training and test sets but does not mention a separate validation set or split for hyperparameter tuning or early stopping. |
| Hardware Specification | No | The paper does not specify any hardware details such as GPU or CPU models used for running the experiments. |
| Software Dependencies | No | The paper mentions using 'MNIST CNN1' based on a cited work but does not provide specific version numbers for any software dependencies or libraries. |
| Experiment Setup | Yes | In our study, we establish rotation angles of 30, 60, 120, and 150 degrees as the training domains, reserving the 90 degree rotation as the test domain. In different experiments, we uniformly acquire 600 or 1800 training samples from distinct combinations of domains among [30, 60, 120, 150] degrees, and get 100 unlabeled samples from 90 degree domain additionally. Subsequently, we employ the ERM, sample reweighting, and self-supervised learning methods based on MNIST CNN1 in Gulrajani & Lopez-Paz (2021) independently. |
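The domain construction described in the Experiment Setup row (rotations of 30/60/120/150 degrees as training domains, 90 degrees held out, with a fixed per-domain sample budget) can be sketched as follows. This is a minimal illustration, not the authors' released code: the function names `make_rotated_domain` and `acquire_from_domains` are hypothetical, and synthetic arrays stand in for the actual MNIST digits.

```python
import numpy as np
from scipy.ndimage import rotate

TRAIN_ANGLES = [30, 60, 120, 150]  # training domains (degrees)
TEST_ANGLE = 90                    # held-out test domain

def make_rotated_domain(images, angle):
    """Rotate each 2-D image by `angle` degrees, keeping the 28x28 shape."""
    return np.stack([rotate(img, angle, reshape=False) for img in images])

def acquire_from_domains(images, labels, angles, per_domain, seed=0):
    """Uniformly acquire `per_domain` samples from each training domain."""
    rng = np.random.default_rng(seed)
    xs, ys, ds = [], [], []
    for angle in angles:
        idx = rng.choice(len(images), size=per_domain, replace=False)
        xs.append(make_rotated_domain(images[idx], angle))
        ys.append(labels[idx])
        ds.append(np.full(per_domain, angle))  # domain label per sample
    return np.concatenate(xs), np.concatenate(ys), np.concatenate(ds)

# Synthetic stand-in data; a real run would load MNIST digits here.
images = np.random.rand(500, 28, 28)
labels = np.random.randint(0, 10, size=500)

# 150 samples from each of the 4 training domains -> 600 total, matching
# the smaller acquisition budget reported in the paper.
x_train, y_train, d_train = acquire_from_domains(
    images, labels, TRAIN_ANGLES, per_domain=150)
```

The 1800-sample setting would use `per_domain=450` under the same uniform scheme; the 100 unlabeled test-domain samples would come from rotating held-out images by `TEST_ANGLE`.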