Multi-task Additive Models for Robust Estimation and Automatic Structure Discovery
Authors: Yingjie Wang, Hong Chen, Feng Zheng, Chen Xu, Tieliang Gong, Yanhong Chen
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on simulations and the CMEs analysis demonstrate the competitive performance of our approach for robust estimation and automatic structure discovery. |
| Researcher Affiliation | Academia | 1. College of Informatics, Huazhong Agricultural University, China; 2. College of Science, Huazhong Agricultural University, China; 3. Department of Computer Science and Engineering, Southern University of Science and Technology, China; 4. Department of Mathematics and Statistics, University of Ottawa, Canada; 5. School of Computer Science and Technology, Xi'an Jiaotong University, China; 6. National Space Science Center, Chinese Academy of Sciences, China |
| Pseudocode | Yes | Algorithm 1: Prox-SAGA for MAM |
| Open Source Code | No | The paper does not provide an explicit statement about the release of source code for the described methodology, nor does it include a link to a code repository. |
| Open Datasets | Yes | Interplanetary CMEs (ICMEs) data are provided in The Richardson and Cane List (http://www.srl.caltech.edu/ACE/ASC/DATA/level3/icmetable2.htm). From this link, we collect 137 ICMEs observations from 1996 to 2016. The features of CMEs are provided in the SOHO LASCO CME Catalog (https://cdaw.gsfc.nasa.gov/CME_list/). In-situ solar wind parameters can be downloaded from OMNIWeb Plus (https://omniweb.gsfc.nasa.gov/). |
| Dataset Splits | Yes | Without loss of generality, we split each S(t) into the training set S(t)_train and the validation set S(t)_val with the same sample size n for subsequent analysis. |
| Hardware Specification | Yes | All experiments are implemented in MATLAB 2019b on an Intel Core i7 with 16 GB memory. |
| Software Dependencies | Yes | All experiments are implemented in MATLAB 2019b on an Intel Core i7 with 16 GB memory. |
| Experiment Setup | Yes | For the hyper-parameters shared by BiGL and MAM, we set Z = 3000, µ = 10^-3, M = 5, Q = 100 and σ = 2. We search the regularization parameter λ over the range {10^-4, 10^-3, 10^-2, 10^-1}. Here, we assume the actual number of groups is known, i.e., L = L*. The weight for each group is set to τ_l = 1 for l ∈ {1, ..., L}. Following the same strategy as in [11], we choose the initialization ϑ(0) = P_ϑ((1/L)·1_{PL} + 0.01·N(0_{PL}, I_{PL})) ∈ R^{PL} and ν(0) = (0.5, ..., 0.5)^T ∈ R^P. |
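The initialization quoted in the Experiment Setup row can be sketched in Python as follows. This is a minimal sketch, not the authors' MATLAB code: the dimensions `P` and `L` are placeholders, the projection P_ϑ is treated as the identity (an assumption; the paper's feasible-set projection is not reproduced here), and the (1/L)·1_{PL} mean term is read from the garbled excerpt.

```python
import numpy as np

def init_mam_params(P, L, seed=None):
    """Sketch of the quoted MAM starting point (assumptions noted above).

    theta0 ~ (1/L) * 1_{PL} + 0.01 * N(0_{PL}, I_{PL}),
    nu0 = (0.5, ..., 0.5)^T in R^P.
    The projection P_theta is omitted (taken as identity).
    """
    rng = np.random.default_rng(seed)
    # Mean vector (1/L) * 1_{PL} plus small isotropic Gaussian noise.
    theta0 = (1.0 / L) * np.ones(P * L) + 0.01 * rng.standard_normal(P * L)
    # All coordinates of nu start at 0.5.
    nu0 = np.full(P, 0.5)
    return theta0, nu0

theta0, nu0 = init_mam_params(P=10, L=3, seed=0)
```

With `P = 10` and `L = 3`, `theta0` has 30 entries concentrated near 1/3 and `nu0` is the constant-0.5 vector in R^10.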
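The per-task split described in the Dataset Splits row (each S(t) divided into train and validation sets of the same size n) can be sketched as below; the random shuffle is an assumption, since the excerpt does not specify how the n samples are chosen.

```python
import numpy as np

def split_task(S_t, n, seed=None):
    """Split one task's sample S(t) into equal-size train/val subsets.

    Returns (S_train, S_val), each with n samples, drawn without
    overlap after a random permutation (the shuffle is an assumption).
    """
    assert len(S_t) >= 2 * n, "need at least 2n samples per task"
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(S_t))
    return S_t[idx[:n]], S_t[idx[n : 2 * n]]

S_train, S_val = split_task(np.arange(20), n=5, seed=0)
```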