Fast and Robust Multi-View Multi-Task Learning via Group Sparsity
Authors: Lu Sun, Canh Hao Nguyen, Hiroshi Mamitsuka
IJCAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on various synthetic and real-world datasets demonstrate its effectiveness. ... (Section 5: Experiments) |
| Researcher Affiliation | Academia | 1Bioinformatics Center, Institute for Chemical Research, Kyoto University, Japan 2Department of Computer Science, Aalto University, Finland |
| Pseudocode | No | The paper describes the steps of the optimization algorithm textually in Section 4 (a), (b), (c), and notes in a footnote that the 'procedure of the algorithm is provided in the supplement', but it does not include structured pseudocode or an algorithm block within the main paper. |
| Open Source Code | Yes | We provide the MATLAB code of AGILE at: https://github.com/futuresun912/AGILE. |
| Open Datasets | Yes | We conduct experiments on four real-world heterogeneous datasets: FOX, Mirflickr, NUS-Scene and NUS-Object. The FOX dataset is extracted from FOX web news [Qian and Zhai, 2014], while Mirflickr, NUS-Scene and NUS-Object refer to image annotation problem [Huiskes and Lew, 2008; Chua et al., 2009]. |
| Dataset Splits | Yes | For evaluation, in each task, we randomly select a%, a%, 20% and 20% of total samples as labeled training set, unlabeled training set, validation set and testing set, respectively, and a is selected from {10, 20, 30}. |
| Hardware Specification | No | The paper does not provide any specific hardware details such as GPU models, CPU specifications, or memory used for running the experiments. |
| Software Dependencies | No | The paper mentions 'MATLAB code' in a footnote but does not provide specific version numbers for MATLAB or any other software libraries or dependencies used in the experiments. |
| Experiment Setup | Yes | In parameter setting, the weight balancing ℓ1 and ℓ2 regularizers in Elastic-Net is selected from {0.2, 0.4, 0.6, 0.8, 1}. ... Values for other parameters are selected from {10^a &#124; &#124;a&#124; ≤ 3}. For each iterative algorithm, we terminate it once the relative change of its objective is below 10⁻⁵, and set the maximum number of iterations as 500. ... AGILE with the setting α = 10, β = 1 and γ = 36. |
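
For illustration only, the sketch below mirrors the evaluation protocol and stopping rule quoted in the table: the a% / a% / 20% / 20% split into labeled-train, unlabeled-train, validation and test sets, the {10^a : |a| ≤ 3} hyper-parameter grid, and the 10⁻⁵ relative-change termination criterion with at most 500 iterations. It is not the authors' MATLAB implementation of AGILE; the function names, seed handling, and index arithmetic are assumptions made for this sketch.

```python
import numpy as np

def split_indices(n_samples, a=0.2, seed=0):
    """Randomly split sample indices into labeled-train / unlabeled-train /
    validation / test sets, following the a%, a%, 20%, 20% protocol reported
    above (a is chosen from {10%, 20%, 30%} per task). Illustrative only."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_lab = int(a * n_samples)          # labeled training portion
    n_val = int(0.20 * n_samples)       # validation portion
    n_test = int(0.20 * n_samples)      # testing portion
    labeled = idx[:n_lab]
    unlabeled = idx[n_lab:2 * n_lab]
    val = idx[2 * n_lab:2 * n_lab + n_val]
    test = idx[2 * n_lab + n_val:2 * n_lab + n_val + n_test]
    return labeled, unlabeled, val, test

# Hyper-parameter grid {10^a : |a| <= 3}, as reported in the table above.
PARAM_GRID = [10.0 ** a for a in range(-3, 4)]

# Maximum number of iterations reported for each iterative algorithm.
MAX_ITER = 500

def converged(prev_obj, curr_obj, tol=1e-5):
    """Stop once the relative change of the objective falls below 10^-5."""
    return abs(prev_obj - curr_obj) / max(abs(prev_obj), 1e-12) < tol
```

As a usage note, one would run each candidate from `PARAM_GRID`, train until `converged(...)` returns True or `MAX_ITER` iterations are reached, and pick the setting with the best validation score before evaluating on the held-out test split.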