Taxonomy-Structured Domain Adaptation
Authors: Tianyi Liu, Zihao Xu, Hao He, Guang-Yuan Hao, Guang-He Lee, Hao Wang
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically, our method achieves state-of-the-art performance on both synthetic and real-world datasets with successful adaptation. |
| Researcher Affiliation | Academia | ¹Rutgers University, ²Massachusetts Institute of Technology, ³The Chinese University of Hong Kong. |
| Pseudocode | Yes | We summarize the training procedure formally in Algorithm 1. ... The inference procedure of TSDA is formally presented in Algorithm 2. |
| Open Source Code | Yes | Code is available at https://github.com/Wang-ML-Lab/TSDA. |
| Open Datasets | Yes | ImageNet-Attribute-DT (Ouyang et al., 2015) builds on the animal images from ImageNet... CUB-DT (He & Peng, 2019) contains 11,788 images of 200 bird categories. |
| Dataset Splits | No | The paper describes source and target domains for training and testing, but it does not specify explicit validation dataset splits (e.g., percentages, counts, or predefined sets) for model tuning or evaluation beyond the source-target distinction. |
| Hardware Specification | Yes | All experiments are run on NVIDIA GeForce RTX 2080 Ti GPUs. |
| Software Dependencies | No | The paper mentions using 'PyTorch' and 'Adam optimizer (Kingma & Ba, 2015)' but does not provide specific version numbers for these or other software dependencies. |
| Experiment Setup | Yes | We have λd, λt, and λe as the weights that balance the discriminator loss, the taxonomist loss, and the predictor loss. ... λd and λe range from 0.1 to 1 and λt ranges from 0.1 to 10. ... We use Adam optimizer (Kingma & Ba, 2015) for all models with learning rates from 1×10⁻⁴ to 1×10⁻⁶. ... Table 4 shows the experiment results of various λd, λt combinations on DT-14. |
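For readers who want a concrete picture of the setup in the last row, below is a minimal PyTorch sketch of how the three weighted loss terms (predictor, discriminator, taxonomist) and the Adam optimizer could be wired together. All module architectures, loss functions, dimensions, and the specific λ values are illustrative placeholders, not the authors' implementation; the actual adversarial updates and taxonomy-reconstruction objective follow Algorithms 1 and 2 in the paper and the released code at https://github.com/Wang-ML-Lab/TSDA.

```python
# Illustrative sketch only: module names and losses are placeholders,
# not the TSDA implementation described in the paper.
import torch
import torch.nn as nn

# Paper reports λd, λe in [0.1, 1] and λt in [0.1, 10]; these values are arbitrary picks.
lambda_d, lambda_t, lambda_e = 0.5, 1.0, 0.5

encoder = nn.Sequential(nn.Linear(128, 64), nn.ReLU())  # placeholder feature extractor
predictor = nn.Linear(64, 10)                           # task head (10 classes, illustrative)
discriminator = nn.Linear(64, 4)                        # domain classifier (4 domains, illustrative)
taxonomist = nn.Linear(64, 4)                           # stand-in for the taxonomy-reconstruction module

params = (list(encoder.parameters()) + list(predictor.parameters())
          + list(taxonomist.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-4)           # paper: learning rates between 1e-4 and 1e-6
optimizer_d = torch.optim.Adam(discriminator.parameters(), lr=1e-4)

def training_step(x, y, domain_idx):
    """One simplified update combining the three weighted loss terms."""
    z = encoder(x)
    loss_e = nn.functional.cross_entropy(predictor(z), y)               # predictor loss
    loss_d = nn.functional.cross_entropy(discriminator(z), domain_idx)  # discriminator loss
    loss_t = nn.functional.cross_entropy(taxonomist(z), domain_idx)     # stand-in taxonomist loss

    # Encoder/predictor/taxonomist step: keep task accuracy and taxonomy
    # information while fooling the discriminator (hence the minus sign).
    loss = lambda_e * loss_e - lambda_d * loss_d + lambda_t * loss_t
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Discriminator step: learn to identify the domain from detached features.
    loss_disc = nn.functional.cross_entropy(discriminator(encoder(x).detach()), domain_idx)
    optimizer_d.zero_grad()
    loss_disc.backward()
    optimizer_d.step()
    return loss.item()
```

The separate discriminator optimizer mirrors the usual alternating adversarial training pattern; in the paper the taxonomist reconstructs the domain taxonomy rather than classifying domains, a detail the simplified stand-in loss above does not capture.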