Domain Adaptation with Invariant Representation Learning: What Transformations to Learn?
Authors: Petar Stojanov, Zijian Li, Mingming Gong, Ruichu Cai, Jaime Carbonell, Kun Zhang
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the efficacy of our method via synthetic and real-world data experiments. |
| Researcher Affiliation | Academia | 1 Carnegie Mellon University 2 School of Computer Science, Guangdong University of Technology 3 School of Mathematics and Statistics, University of Melbourne 4 Broad Institute of MIT and Harvard |
| Pseudocode | No | The paper describes the method verbally and with a diagram (Figure 3), but does not include any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at: https://github.com/DMIRLAB-Group/DSAN. |
| Open Datasets | Yes | To evaluate our method on real datasets, we consider three datasets and respective tasks from various domains of applications: cross-domain Wi-Fi localization, Amazon product reviews and image classification. |
| Dataset Splits | No | No explicit dataset split percentages or sample counts for the training, validation, or test sets are provided in the main text; these details are deferred to the supplementary materials. |
| Hardware Specification | No | No specific hardware details such as GPU/CPU models, memory, or computing infrastructure are mentioned in the paper. |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., Python version, library versions like PyTorch, TensorFlow, etc.) are mentioned in the paper. |
| Experiment Setup | No | The main text defers setup details: "For detailed descriptions of the experimental design, hyperparameter tuning and neural network architectures, as well as ablation studies, we refer the interested reader to the supplementary materials." |