Learning Semantic Representations for Unsupervised Domain Adaptation
Authors: Shaoan Xie, Zibin Zheng, Liang Chen, Chuan Chen
ICML 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments testify that our model yields state of the art results on standard datasets. |
| Researcher Affiliation | Academia | 1School of Data and Computer Science, Sun Yat-sen University, Guangzhou, China 2National Engineering Research Center of Digital Life, Sun Yat-sen University, Guangzhou, China. |
| Pseudocode | Yes | Algorithm 1 Moving semantic transfer loss computation in iteration t in our model. |
| Open Source Code | Yes | Codes are available at https://github.com/Mid-Push/Moving-Semantic-Transfer-Network. |
| Open Datasets | Yes | Office-31 (Saenko et al., 2010) is a standard dataset used for domain adaptation. MNIST (LeCun et al., 1998), USPS, and SVHN (Netzer et al., 2011). |
| Dataset Splits | No | The paper mentions applying “reverse validation” for hyperparameter tuning but does not explicitly provide percentages or counts for training/validation/test dataset splits for the experiments. |
| Hardware Specification | No | The paper mentions the network architectures used (AlexNet, CNN) but does not specify any hardware details such as CPU/GPU models, memory, or cloud resources used for experiments. |
| Software Dependencies | No | The paper discusses optimization methods (stochastic gradient descent) but does not specify software dependencies with version numbers (e.g., Python version, PyTorch/TensorFlow version, CUDA version). |
| Experiment Setup | Yes | We set θ=0.7 in all our experiments. For the weight balance parameter, we set λ = 2/(1+exp(−γ·p)) − 1, where γ is set to 10 and p is training progress changing from 0 to 1. ... Stochastic gradient descent with 0.9 momentum is used. The learning rate is annealed by µ_p = µ_0/(1+α·p)^β, where µ_0=0.01, α=10 and β=0.75... We set the batch size to 128 for each domain. |
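The two schedules quoted in the Experiment Setup row can be written out directly. The sketch below implements them with the reported constants (γ=10, µ_0=0.01, α=10, β=0.75); the function names are our own, not from the paper.

```python
import math

def adaptation_weight(p, gamma=10.0):
    """Weight-balance schedule lambda = 2/(1+exp(-gamma*p)) - 1.

    p is training progress in [0, 1]; lambda ramps smoothly from 0 toward 1,
    delaying the adaptation losses early in training.
    """
    return 2.0 / (1.0 + math.exp(-gamma * p)) - 1.0

def annealed_lr(p, mu0=0.01, alpha=10.0, beta=0.75):
    """Learning-rate annealing mu_p = mu0 / (1 + alpha*p)^beta."""
    return mu0 / (1.0 + alpha * p) ** beta
```

At p=0 the weight is exactly 0 and the learning rate is µ_0; by p=1 the weight has saturated near 1 while the learning rate has decayed by roughly a factor of six.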
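The Pseudocode row points at "Algorithm 1: Moving semantic transfer loss computation". A minimal sketch of that idea, assuming per-class feature centroids are maintained as an exponential moving average with the paper's θ=0.7 and the semantic loss is the squared distance between matched source and target centroids (function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def update_centroids(prev_centroids, features, labels, num_classes, theta=0.7):
    """EMA centroid update: C_k^t = theta * C_k^{t-1} + (1 - theta) * c_k,
    where c_k is the mean feature of class k in the current batch.
    Classes absent from the batch keep their previous centroid.
    """
    new_centroids = prev_centroids.copy()
    for k in range(num_classes):
        mask = labels == k
        if mask.any():
            batch_centroid = features[mask].mean(axis=0)
            new_centroids[k] = theta * prev_centroids[k] + (1 - theta) * batch_centroid
    return new_centroids

def semantic_loss(src_centroids, tgt_centroids):
    """Sum of squared distances between matched source/target class centroids."""
    return float(np.sum((src_centroids - tgt_centroids) ** 2))
```

In use, source centroids would be updated from ground-truth labels and target centroids from pseudo-labels, with the loss weighted by the λ schedule above.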