Flexible Transfer Learning under Support and Model Shift

Authors: Xuezhi Wang, Jeff Schneider

NeurIPS 2014

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluate our methods on synthetic data and real-world grape image data. The experimental results show that our transfer learning algorithms significantly outperform existing methods with few labeled target data points.
Researcher Affiliation | Academia | Xuezhi Wang, Computer Science Department, Carnegie Mellon University, xuezhiw@cs.cmu.edu; Jeff Schneider, Robotics Institute, Carnegie Mellon University, schneide@cs.cmu.edu
Pseudocode | No | The paper describes the steps of the SMS approach in paragraph form, but it does not provide a clearly labeled pseudocode or algorithm block.
Open Source Code | No | The paper mentions that code for the 'T/C shift' baseline is available at 'http://people.tuebingen.mpg.de/kzhang/Code-TarS.zip', but it does not provide code for its own proposed methodology.
Open Datasets | Yes | We have two datasets with grape images taken from vineyards and the number of grapes on them as labels, one is riesling (128 labeled images), another is traminette (96 labeled images), as shown in Figure 3. [...] [3] Nuske, S., Gupta, K., Narasimhan, S., and Singh, S. Modeling and calibrating visual yield estimates in vineyards. International Conference on Field and Service Robotics, 2012.
Dataset Splits | Yes | The parameters are chosen by cross-validation.
Hardware Specification | Yes | In our real-world dataset with 2177 features, it takes about 2.54 minutes on average in a single-threaded MATLAB process on a 3.1 GHz CPU with 8 GB RAM to solve the objective and recover the transformation.
Software Dependencies | No | The paper mentions a 'single-threaded MATLAB process' but does not specify a version number for MATLAB or any other software dependencies.
Experiment Setup | No | The paper states that 'The parameters are chosen by cross-validation.' but does not provide specific hyperparameter values (e.g., kernel widths or regularization weights) or detailed system-level training settings.
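The paper says only that "the parameters are chosen by cross-validation" without listing the grid, the fold count, or the selected values. As a point of reference for what a reproduction would need to pin down, here is a minimal sketch of k-fold cross-validated hyperparameter selection for a Gaussian-kernel ridge regressor. The model, the grid values, and the fold count are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def kernel_ridge_predict(X_tr, y_tr, X_te, sigma, lam):
    """Gaussian (RBF) kernel ridge regression.

    Illustrative stand-in for a kernel regressor; NOT the paper's
    exact model. sigma is the kernel width, lam the ridge weight.
    """
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    K = rbf(X_tr, X_tr)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)
    return rbf(X_te, X_tr) @ alpha

def cv_select(X, y, sigmas, lams, n_folds=5, seed=0):
    """Pick (sigma, lam) with lowest mean held-out squared error."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), n_folds)
    best, best_err = None, np.inf
    for sigma in sigmas:
        for lam in lams:
            errs = []
            for k in range(n_folds):
                te = folds[k]
                tr = np.concatenate([folds[j] for j in range(n_folds) if j != k])
                pred = kernel_ridge_predict(X[tr], y[tr], X[te], sigma, lam)
                errs.append(np.mean((pred - y[te]) ** 2))
            if np.mean(errs) < best_err:
                best, best_err = (sigma, lam), np.mean(errs)
    return best

# Toy 1-D regression problem (hypothetical data, for illustration only).
X = np.linspace(0.0, 3.0, 60)[:, None]
y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(1).standard_normal(60)
sigma, lam = cv_select(X, y, sigmas=[0.1, 0.5, 1.0], lams=[1e-3, 1e-1, 1.0])
print(sigma, lam)
```

A full experiment description would report exactly this kind of information: the candidate grids, the number of folds, and the values ultimately selected.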