Estimation with Norm Regularization
Authors: Arindam Banerjee, Sheng Chen, Farideh Fazayeli, Vidyashankar Sivakumar
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | This paper presents generalizations of such estimation error analysis on all four aspects. We characterize the restricted error set, establish relations between error sets for the constrained and regularized problems, and present an estimation error bound applicable to any norm. Precise characterizations of the bound are presented for a variety of noise models, design matrices (including sub-Gaussian, anisotropic, and dependent samples), and loss functions (including least squares and generalized linear models). Gaussian width, a geometric measure of the size of sets, and associated tools play a key role in our generalized analysis. (A hedged code sketch of the regularized estimator and of Gaussian width appears after this table.) |
| Researcher Affiliation | Academia | Department of Computer Science & Engineering, University of Minnesota, Twin Cities; {banerjee,shengc,farideh,sivakuma}@cs.umn.edu |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | No | The paper is theoretical and does not use or specify publicly available datasets for experimental training. |
| Dataset Splits | No | The paper is theoretical and does not describe experimental dataset splits for validation or training. |
| Hardware Specification | No | The paper is theoretical and does not describe hardware used for experiments. |
| Software Dependencies | No | The paper is theoretical and does not specify software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe specific experimental setup details, hyperparameters, or training configurations. |
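
The core object in the paper is the norm-regularized estimator, obtained by minimizing a loss plus a multiple of a norm, with the Gaussian width of the restricted error set governing the estimation error bound. Below is a minimal, hedged Python sketch of the canonical instance (squared loss with the l1 norm, i.e., the Lasso) solved by proximal gradient descent, together with a Monte Carlo estimate of the Gaussian width of the unit l1 ball. The problem sizes, step size, regularization level, and sample counts are illustrative assumptions, not values from the paper.

```python
import numpy as np


def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)


def l1_regularized_least_squares(X, y, lam, step=None, n_iter=500):
    """Norm-regularized estimator: argmin_theta 0.5*||y - X theta||_2^2 + lam*||theta||_1,
    computed with proximal gradient descent (ISTA). lam, step, n_iter are illustrative."""
    _, p = X.shape
    if step is None:
        step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    theta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ theta - y)             # gradient of the squared loss
        theta = soft_threshold(theta - step * grad, step * lam)
    return theta


def gaussian_width_l1_ball(p, n_samples=2000, seed=0):
    """Monte Carlo estimate of the Gaussian width of the unit l1 ball in R^p:
    w(B_1) = E[ sup_{||u||_1 <= 1} <g, u> ] = E[ ||g||_inf ],  g ~ N(0, I_p)."""
    g = np.random.default_rng(seed).standard_normal((n_samples, p))
    return np.abs(g).max(axis=1).mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p, s = 200, 500, 10                        # sub-Gaussian design, s-sparse target
    X = rng.standard_normal((n, p))
    theta_star = np.zeros(p)
    theta_star[:s] = 1.0
    y = X @ theta_star + 0.1 * rng.standard_normal(n)

    lam = 0.1 * np.sqrt(2 * np.log(p) * n)        # illustrative choice matched to the noise level
    theta_hat = l1_regularized_least_squares(X, y, lam)
    print("||theta_hat - theta_star||_2 =", np.linalg.norm(theta_hat - theta_star))
    print("Gaussian width of unit l1 ball:", gaussian_width_l1_ball(p),
          "vs sqrt(2 log p) =", np.sqrt(2 * np.log(p)))
```

For the unit l1 ball the supremum in the width definition is attained at a signed coordinate vector, so the Monte Carlo estimate should track sqrt(2 log p), the quantity that appears in standard sparse-recovery sample-complexity statements.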