Convex Learning of Multiple Tasks and their Structure

Authors: Carlo Ciliberto, Youssef Mroueh, Tomaso Poggio, Lorenzo Rosasco

ICML 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We empirically evaluated the efficacy of the block coordinate optimization strategy proposed in this paper on both artificial and real datasets. Synthetic experiments were performed to assess the computational aspects of the approach, while we evaluated the quality of solutions found by the system on realistic settings.
Researcher Affiliation | Academia | Carlo Ciliberto (1,2) CCILIBER@MIT.EDU; Youssef Mroueh (1,2) YMROUEH@MIT.EDU; Tomaso Poggio (1,2) TP@AI.MIT.EDU; Lorenzo Rosasco (1,2,3) LROSASCO@MIT.EDU. (1) Laboratory for Computational and Statistical Learning, Istituto Italiano di Tecnologia, Via Morego 30, Genova, Italy; (2) Center for Brains, Minds and Machines, Massachusetts Institute of Technology, Cambridge, MA 02139, USA; (3) DIBRIS, Università di Genova, Via Dodecaneso 35, 16146 Genova, Italy.
Pseudocode | Yes | Algorithm 1 CONVEX MULTI-TASK LEARNING (an illustrative sketch of this alternating scheme appears below the table).
Open Source Code | No | On these learning problems we compared the computational performance of our alternating minimization strategy against the optimization algorithms originally proposed for MTCL and MTFL, for which the code has been made available by the authors.
Open Datasets | Yes | SARCOS is a dataset for regression problems (21-dimensional inputs and 7 outputs), available at http://www.gaussianprocess.org/gpml/data/, and 15-Scenes is a dataset designed for scene recognition, available at http://www-cvr.ai.uiuc.edu/ponce_grp/data/.
Dataset Splits | Yes | For each task, we randomly sampled 50, 100, 150 and 200 training examples, while we kept a test set of 5000 examples in common for all tasks. We used a linear kernel and performed 5-fold cross-validation to find the best regularization parameter according to the normalized mean squared error (nMSE) of predicted torques. (A sketch of this split and cross-validation protocol appears below the table.)
Hardware Specification | No | The paper does not specify the hardware (e.g., CPU, GPU, RAM, specific machine models) used for running the experiments.
Software Dependencies | No | The paper does not provide specific version numbers for software dependencies or libraries used in the experimental setup.
Experiment Setup | Yes | We used a linear kernel and performed 5-fold cross-validation to find the best regularization parameter according to the normalized mean squared error (nMSE) of predicted torques. We used least squares loss for all experiments. In our algorithm we used A0 = I, the identity matrix, as initialization for the alternating minimization procedure. We represented images using LLC coding (Wang et al., 2010). (A usage sketch combining these settings appears below the table.)
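
The Pseudocode row points to Algorithm 1 (CONVEX MULTI-TASK LEARNING), which alternates a block coordinate minimization between the task coefficients and the matrix encoding the tasks' structure. Below is a minimal, illustrative sketch of that style of scheme for the linear-kernel, least-squares case; the variable names (W for the task weights, A for the structure matrix), the specific penalty and A-update (a normalized matrix square root, as in classical multi-task structure learning), and the fixed iteration count are assumptions made for illustration, not the paper's exact updates.

```python
import numpy as np

def sqrtm_psd(S):
    """Matrix square root of a symmetric positive semidefinite matrix."""
    evals, U = np.linalg.eigh(S)
    return (U * np.sqrt(np.clip(evals, 0.0, None))) @ U.T

def alternating_mtl(X, Y, lam=0.1, eps=1e-6, n_iter=50):
    """Illustrative block coordinate minimization for multi-task learning.

    X: (n, d) shared inputs; Y: (n, T) outputs, one column per task.
    Alternates a regularized least-squares solve for the task weights W
    (structure matrix A fixed) with a closed-form update of A (W fixed).
    The penalty used here, tr(A^{-1} W^T W), and its A-update are
    illustrative choices, not necessarily the paper's.
    """
    n, d = X.shape
    T = Y.shape[1]
    A = np.eye(T)                      # A0 = I, the initialization reported in the paper
    G = X.T @ X                        # Gram matrix for the linear kernel in feature space
    for _ in range(n_iter):
        # Step 1: with A fixed, W solves G W + n*lam * W A^{-1} = X^T Y,
        # a Sylvester equation that decouples in the eigenbasis of A.
        evals, U = np.linalg.eigh(A + eps * np.eye(T))
        B = X.T @ (Y @ U)              # right-hand side rotated into A's eigenbasis
        Wt = np.zeros((d, T))
        for t in range(T):
            Wt[:, t] = np.linalg.solve(G + (n * lam / evals[t]) * np.eye(d), B[:, t])
        W = Wt @ U.T                   # rotate back to the original task coordinates
        # Step 2: with W fixed, A is the normalized matrix square root of W^T W.
        M = sqrtm_psd(W.T @ W) + eps * np.eye(T)
        A = M / np.trace(M)
    return W, A
```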
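
The Dataset Splits row describes the SARCOS protocol: 50, 100, 150 or 200 randomly sampled training examples, a common test set of 5000 examples, and 5-fold cross-validation over the regularization parameter scored by normalized mean squared error (nMSE). The sketch below follows that protocol under stated assumptions: data loading is omitted, the grid of candidate regularization values is made up, and a plain per-task ridge regression stands in for the paper's multi-task solver.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.linear_model import Ridge

def nmse(y_true, y_pred):
    """Normalized mean squared error: MSE divided by the variance of the targets."""
    return np.mean((y_true - y_pred) ** 2) / np.var(y_true)

def run_split(X, Y, n_train, lambdas=(1e-4, 1e-3, 1e-2, 1e-1, 1.0), seed=0):
    """Sample n_train training points, keep a common 5000-point test set,
    and pick the regularization parameter by 5-fold CV on the training set.
    Assumes len(X) > 5000 + n_train (true for SARCOS)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    test_idx, pool = idx[:5000], idx[5000:]
    train_idx = pool[:n_train]
    Xtr, Ytr = X[train_idx], Y[train_idx]
    Xte, Yte = X[test_idx], Y[test_idx]

    # 5-fold cross-validation over the regularization parameter, scored by nMSE.
    best_lam, best_score = None, np.inf
    for lam in lambdas:
        scores = []
        for tr, va in KFold(n_splits=5, shuffle=True, random_state=seed).split(Xtr):
            model = Ridge(alpha=lam).fit(Xtr[tr], Ytr[tr])
            scores.append(nmse(Ytr[va], model.predict(Xtr[va])))
        if np.mean(scores) < best_score:
            best_lam, best_score = lam, np.mean(scores)

    final = Ridge(alpha=best_lam).fit(Xtr, Ytr)
    return best_lam, nmse(Yte, final.predict(Xte))
```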
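
The Experiment Setup row ties these pieces together: a linear kernel, the least-squares loss, and the identity matrix as the starting point of the alternating minimization. A short usage sketch under those assumptions, reusing the two hypothetical helpers above (alternating_mtl and run_split) on random stand-in data with SARCOS-like shapes:

```python
import numpy as np

# Random stand-in data with SARCOS-like shapes: 21-dimensional inputs, 7 output tasks.
rng = np.random.default_rng(0)
X = rng.standard_normal((6000, 21))
Y = rng.standard_normal((6000, 7))

# Pick the regularization parameter with the 5-fold CV protocol sketched above.
best_lam, baseline_nmse = run_split(X, Y, n_train=200)
print(f"ridge baseline: lambda={best_lam}, test nMSE={baseline_nmse:.3f}")

# Alternating minimization with least-squares loss and a linear kernel;
# the structure matrix starts at the identity inside alternating_mtl.
W, A = alternating_mtl(X[:200], Y[:200], lam=best_lam)
preds = X[200:] @ W   # linear predictions for the remaining examples
```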