Hypothesis Transfer Learning via Transformation Functions

Authors: Simon S. Du, Jayanth Koushik, Aarti Singh, Barnabas Poczos

NeurIPS 2017

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on real world data demonstrate the effectiveness of our framework. |
| Researcher Affiliation | Academia | Simon S. Du, Carnegie Mellon University, ssdu@cs.cmu.edu |
| Pseudocode | Yes | Algorithm 1: Transformation Function based Transfer Learning |
| Open Source Code | No | The paper does not provide an explicit statement about releasing source code or a link to a code repository for the methodology described. |
| Open Datasets | Yes | We use two datasets from the kin family in Delve [Rasmussen et al., 1996]. |
| Dataset Splits | Yes | Hyper-parameters were picked using grid search with 10-fold cross-validation on the target data (or source domain data when not using the target domain data). |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper does not specify version numbers for any software components or libraries used. |
| Experiment Setup | Yes | We set nso to 320, and vary nta in {10, 20, 40, 80, 160, 320}. Hyper-parameters were picked using grid search with 10-fold cross-validation on the target data (or source domain data when not using the target domain data). |
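The hyper-parameter selection the paper describes (grid search with 10-fold cross-validation on the target-domain data) can be sketched as follows. This is a minimal illustration, not the authors' code: the synthetic data, the kernel ridge model, and the parameter grid are all assumptions standing in for the kin datasets and the paper's actual estimator; only the selection protocol (grid search, cv=10, n_ta = 320) comes from the text.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

# Hypothetical stand-in for a target-domain sample; the paper varies
# n_ta over {10, 20, 40, 80, 160, 320}. We use the largest size here.
rng = np.random.default_rng(0)
n_ta = 320
X = rng.normal(size=(n_ta, 8))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n_ta)

# Grid search with 10-fold cross-validation on the target data,
# mirroring the selection protocol quoted in the table above.
# The grid values themselves are illustrative assumptions.
param_grid = {"alpha": [1e-3, 1e-2, 1e-1, 1.0], "gamma": [0.1, 1.0, 10.0]}
search = GridSearchCV(KernelRidge(kernel="rbf"), param_grid, cv=10)
search.fit(X, y)
print(search.best_params_)
```

When no target-domain data is available, the paper applies the same procedure to the source-domain sample instead; in the sketch, that amounts to fitting the same `GridSearchCV` object on the source `(X, y)` arrays.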