IDEA: An Invariant Perspective for Efficient Domain Adaptive Image Retrieval
Authors: Haixin Wang, Hao Wu, Jinan Sun, Shikun Zhang, Chong Chen, Xian-Sheng Hua, Xiao Luo
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Comprehensive experiments conducted on benchmark datasets confirm the superior performance of our proposed IDEA compared to a variety of competitive baselines. |
| Researcher Affiliation | Collaboration | 1Peking University, 2University of Science and Technology of China, 3Terminus Group, 4University of California, Los Angeles |
| Pseudocode | Yes | Algorithm 1 Training Algorithm of IDEA |
| Open Source Code | No | The paper does not include a statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | Experiments are conducted on different benchmark datasets: (1) Office-Home dataset [57]: ... (2) Office-31 dataset [42]: ... (3) Digits dataset: We focus on MNIST [26] and USPS [23]... |
| Dataset Splits | No | The paper specifies a 'training set' and 'test queries' but does not explicitly mention a separate 'validation' set or its split. |
| Hardware Specification | Yes | We perform experiments on an A100-40GB GPU. |
| Software Dependencies | No | The paper mentions 'mini-batch SGD with momentum' for optimization but does not specify software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | The batch size is set to 36 and the learning rate is fixed at 0.001 (see the configuration sketch below the table). |
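
The reported settings (mini-batch SGD with momentum, batch size 36, fixed learning rate 0.001) are enough to reconstruct a minimal training-loop sketch. The code below is an assumption-laden illustration, not the authors' implementation: the paper specifies no framework, so PyTorch is assumed, the momentum value of 0.9 is assumed, and the linear model and random tensors are placeholders standing in for the IDEA architecture and the Office-Home / Office-31 / Digits benchmarks.

```python
# Hedged sketch of the reported training setup. Only the optimizer family
# (mini-batch SGD with momentum), batch size (36), and learning rate (0.001)
# come from the paper; everything else here is a placeholder assumption.
import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset

# Placeholder model; the paper's actual IDEA network is not reproduced here.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 64))

# Reported: SGD with momentum, fixed lr = 0.001. Momentum 0.9 is an assumption.
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

# Dummy tensors standing in for a benchmark dataset; batch size 36 as reported.
images = torch.randn(360, 3, 224, 224)
labels = torch.randint(0, 64, (360,))
loader = DataLoader(TensorDataset(images, labels), batch_size=36, shuffle=True)

criterion = nn.CrossEntropyLoss()
model.train()
for batch_images, batch_labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(batch_images), batch_labels)
    loss.backward()
    optimizer.step()
```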