DGD^2: A Linearly Convergent Distributed Algorithm For High-dimensional Statistical Recovery
Authors: Marie Maros, Gesualdo Scutari
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | To demonstrate the effectiveness of DGD^2, we conduct numerical experiments on various high-dimensional statistical recovery problems including group Lasso, sparse logistic regression, and sparse covariance matrix estimation. |
| Researcher Affiliation | Academia | Marie Maros Purdue University... Gesualdo Scutari Purdue University |
| Pseudocode | Yes | Algorithm 1: DGD^2 for distributed statistical recovery |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code or links to a code repository for the described methodology. |
| Open Datasets | Yes | For sparse logistic regression, we downloaded the RCV1 dataset from LIBSVM Data [37]. For sparse covariance matrix estimation, we applied DGD^2 to the Million Song Dataset, which is obtained from the UCI Machine Learning Repository. |
| Dataset Splits | No | The paper does not provide specific details on training, validation, and test dataset splits (e.g., percentages, sample counts, or explicit splitting methodology) needed for reproduction. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, memory amounts, or detailed computer specifications) used for running its experiments. |
| Software Dependencies | Yes | All simulations are implemented in MATLAB R2018a. |
| Experiment Setup | Yes | Unless otherwise specified, for all experiments, the stepsize is set to γ_k = 0.001 and the regularization parameter λ = 0.001. The maximum iteration number is 5000. |
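Since no source code is released (see the Open Source Code row), the following is a minimal MATLAB sketch of a generic decentralized proximal-gradient (prox-DGD) loop for sparse linear regression, plugging in the reported hyperparameters (stepsize 0.001, λ = 0.001, 5000 iterations). It is an illustration under assumptions, not the paper's DGD^2 update: the number of agents, local data model, problem dimensions, and ring-graph mixing matrix are all hypothetical.

```matlab
% Hypothetical sketch: generic decentralized proximal-gradient (prox-DGD)
% for sparse linear regression. NOT the paper's DGD^2 algorithm.
% Reported hyperparameters: gamma = 1e-3, lambda = 1e-3, 5000 iterations.
rng(0);
m = 10;             % number of agents (assumption)
n = 50;  d = 200;   % samples per agent, dimension (assumptions)
s = 10;             % sparsity of the ground-truth signal (assumption)

theta_true = zeros(d, 1);
theta_true(randperm(d, s)) = randn(s, 1);

A = cell(m, 1);  b = cell(m, 1);
for i = 1:m
    A{i} = randn(n, d) / sqrt(n);               % local design matrix
    b{i} = A{i} * theta_true + 0.01 * randn(n, 1);
end

% Doubly stochastic mixing matrix for a ring graph (equal weights).
W = eye(m) / 3;
for i = 1:m
    W(i, mod(i, m) + 1) = 1/3;                   % right neighbor
    W(mod(i, m) + 1, i) = 1/3;                   % left neighbor
end

gamma = 1e-3;  lambda = 1e-3;  K = 5000;
X = zeros(d, m);                                 % column i = agent i's iterate
soft = @(z, t) sign(z) .* max(abs(z) - t, 0);    % soft-thresholding prox

for k = 1:K
    G = zeros(d, m);
    for i = 1:m
        % Gradient of the local loss f_i(x) = (1/(2n)) * ||A_i x - b_i||^2.
        G(:, i) = A{i}' * (A{i} * X(:, i) - b{i}) / n;
    end
    % Mix with neighbors, take a gradient step, then apply the l1 prox.
    X = soft(X * W' - gamma * G, gamma * lambda);
end

err = norm(mean(X, 2) - theta_true) / norm(theta_true);
fprintf('relative recovery error: %.3e\n', err);
```

Note that plain prox-DGD with a constant stepsize converges only to a neighborhood of the solution; the paper's DGD^2 is designed to achieve linear convergence up to statistical precision, which this generic sketch does not claim to reproduce.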