Group LASSO with Asymmetric Structure Estimation for Multi-Task Learning
Authors: Saullo H. G. Oliveira, André R. Gonçalves, Fernando J. Von Zuben
IJCAI 2019 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We performed experiments using synthetic and real datasets to compare our proposal with state-of-the-art approaches, evidencing the promising predictive performance and distinguished interpretability of our proposal. |
| Researcher Affiliation | Collaboration | ¹School of Electrical and Computer Engineering (FEEC), University of Campinas (Unicamp), Brazil; ²Lawrence Livermore National Laboratory, USA |
| Pseudocode | Yes | The complete process is presented in Algorithm (1). |
| Open Source Code | Yes | The source codes are available at GitHub. (...) The Python code associated with GAMTL is available online: https://github.com/shgo/gamtl |
| Open Datasets | Yes | The ADNI dataset was collected by the Alzheimer's Disease Neuroimaging Initiative (ADNI) and pre-processed by a team from University of California at San Francisco, as described in [Liu et al., 2018] |
| Dataset Splits | Yes | For each amount of samples, the parameters of all methods were chosen by cross-validation using 30% of the training set. (...) Regularization parameters for the methods are chosen by a 5-fold cross-validation procedure using training data. |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running the experiments, such as GPU or CPU models. |
| Software Dependencies | No | The paper mentions 'The Python code associated with GAMTL is available online' but does not specify version numbers for Python or any libraries used (e.g., PyTorch, TensorFlow, scikit-learn). |
| Experiment Setup | Yes | All variants of GAMTL used λ1 ∈ [10e-5, ..., 0.03], λ2 ∈ [0.01, ..., 0.5], and λ3 ∈ [0.008, ..., 0.15]. (...) To account for variability in the data, 30 independent executions were performed. |
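
The "Dataset Splits" and "Experiment Setup" rows describe a protocol of cross-validated selection over the reported λ grids, repeated across 30 independent executions. The sketch below illustrates one way such a protocol could be scripted with scikit-learn; `GAMTLStandIn`, the grid densities, the synthetic data, and the 70/30 train/test split are all assumptions for illustration, not the actual GAMTL implementation (which lives at https://github.com/shgo/gamtl).

```python
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin
from sklearn.linear_model import Lasso
from sklearn.model_selection import GridSearchCV, train_test_split


class GAMTLStandIn(BaseEstimator, RegressorMixin):
    """Hypothetical stand-in exposing lambda1/lambda2/lambda3 hyperparameters.

    It only mimics the hyperparameter interface so the protocol below is
    runnable; it is not the GAMTL model from the paper.
    """

    def __init__(self, lambda1=1e-3, lambda2=0.1, lambda3=0.05):
        self.lambda1 = lambda1
        self.lambda2 = lambda2
        self.lambda3 = lambda3

    def fit(self, X, y):
        # Simple sparse linear model as a placeholder for GAMTL's solver.
        self.model_ = Lasso(alpha=self.lambda1).fit(X, y)
        return self

    def predict(self, X):
        return self.model_.predict(X)


# Hyperparameter grids spanning the ranges reported in the paper
# (grid resolution is an assumption).
param_grid = {
    "lambda1": np.geomspace(1e-5, 0.03, 3),
    "lambda2": np.linspace(0.01, 0.5, 3),
    "lambda3": np.linspace(0.008, 0.15, 3),
}

# Synthetic single-task data, only to make the sketch self-contained.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))
y = X @ rng.normal(size=20) + 0.1 * rng.normal(size=200)

scores = []
for run in range(30):  # 30 independent executions, as reported
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.3, random_state=run
    )
    # 5-fold cross-validation on the training data selects the lambdas.
    search = GridSearchCV(GAMTLStandIn(), param_grid, cv=5)
    search.fit(X_tr, y_tr)
    scores.append(search.score(X_te, y_te))

print(f"Mean R^2 over 30 runs: {np.mean(scores):.3f}")
```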