A convex optimization formulation for multivariate regression
Authors: Yunzhang Zhu
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirically, the proposed methods outperform existing high-dimensional multivariate linear regression methods that are based on either minimizing certain non-convex criteria or certain two-step procedures. In summary, our simulation results suggest that the proposed methods achieve higher accuracy of sparse identification and parameter estimation, compared to other competitors. |
| Researcher Affiliation | Academia | Yunzhang Zhu Department of Statistics Ohio State University Columbus, OH 43210 zhu.219@osu.edu |
| Pseudocode | No | The paper mentions algorithms (e.g., "proximal Newton algorithm", "FISTA algorithm") and states that "Details of these algorithm derivations are included in the Appendix," but the provided text does not contain any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statement about making its source code available or include a link to a code repository. |
| Open Datasets | No | The numerical experiments use simulated data: "In what follows, we consider simulated data from the multivariate regression model..." No publicly available dataset is mentioned with concrete access information. |
| Dataset Splits | Yes | With regard to selection of tuning parameters, we fix τΩ = .01 and τB = .01q, and propose to use vanilla cross-validation to choose the optimal tuning parameters (λB, λΩ) for our methods by minimizing a Kullback-Leibler criterion using five-fold CV. |
| Hardware Specification | No | The paper does not specify any hardware details (e.g., GPU models, CPU types, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions using an "R package provided by [Rothman et al., 2008]" but does not specify any version numbers for this or any other software dependencies, nor does it list its own implementation dependencies with versions. |
| Experiment Setup | Yes | With regard to selection of tuning parameters, we fix τΩ = .01 and τB = .01q, and propose to use vanilla cross-validation to choose the optimal tuning parameters (λB, λΩ) for our methods by minimizing a Kullback-Leibler criterion using five-fold CV. |
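The tuning-parameter selection quoted above (a five-fold CV grid search over (λB, λΩ) minimizing a Gaussian Kullback-Leibler-type validation criterion) can be sketched as follows. This is a minimal illustration, not the paper's method: the paper fits a convex sparse multivariate regression criterion, whereas this sketch swaps in a ridge estimate of B and a diagonally shrunk precision matrix Ω purely to make the CV loop concrete; all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multivariate regression data Y = X B + E (dimensions are illustrative only).
n, p, q = 100, 10, 5
X = rng.standard_normal((n, p))
B_true = np.zeros((p, q))
B_true[:3, :2] = 1.0
Y = X @ B_true + 0.5 * rng.standard_normal((n, q))

def fit(Xtr, Ytr, lam_B, lam_Omega):
    """Stand-in estimator: ridge estimate of B, then a shrunk precision matrix Omega.

    The paper's actual estimator is a convex sparse criterion; this closed-form
    surrogate just lets the CV machinery below run end to end.
    """
    p = Xtr.shape[1]
    B = np.linalg.solve(Xtr.T @ Xtr + lam_B * np.eye(p), Xtr.T @ Ytr)
    R = Ytr - Xtr @ B                      # training residuals
    S = R.T @ R / Xtr.shape[0]             # residual covariance
    Omega = np.linalg.inv(S + lam_Omega * np.eye(S.shape[0]))
    return B, Omega

def kl_criterion(Xva, Yva, B, Omega):
    """Held-out Gaussian negative log-likelihood (a KL-type validation criterion)."""
    R = Yva - Xva @ B
    _, logdet = np.linalg.slogdet(Omega)
    return np.trace(R @ Omega @ R.T) / Xva.shape[0] - logdet

def five_fold_cv(X, Y, grid_B, grid_Omega, k=5):
    """Pick (lam_B, lam_Omega) minimizing the summed held-out criterion over k folds."""
    folds = np.array_split(np.arange(X.shape[0]), k)
    best, best_score = None, np.inf
    for lb in grid_B:
        for lo in grid_Omega:
            score = 0.0
            for va in folds:
                tr = np.setdiff1d(np.arange(X.shape[0]), va)
                B, Omega = fit(X[tr], Y[tr], lb, lo)
                score += kl_criterion(X[va], Y[va], B, Omega)
            if score < best_score:
                best, best_score = (lb, lo), score
    return best

lam_B, lam_Omega = five_fold_cv(X, Y, [0.1, 1.0, 10.0], [0.01, 0.1])
print(lam_B, lam_Omega)
```

The grid values here are placeholders; the paper does not report its candidate grid, only the fixed values τΩ = .01 and τB = .01q and the five-fold KL-based selection rule.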