On the consistency theory of high dimensional variable screening

Authors: Xiangyu Wang, Chenlei Leng, David B. Dunson

NeurIPS 2015

Reproducibility Assessment

Variable | Result | LLM Response
Research Type | Theoretical | This article studies a class of linear screening methods and establishes consistency theory for this class. In particular, the authors prove that the restricted diagonally dominant (RDD) condition is necessary and sufficient for strong screening consistency. As concrete examples, they show that two screening methods, SIS and HOLP, are both strongly screening consistent (subject to additional constraints) with high probability under random designs when n > O((ρs + σ/τ)² log p). (An illustrative sketch of the SIS and HOLP estimators appears after this table.)
Researcher Affiliation | Academia | Xiangyu Wang, Dept. of Statistical Science, Duke University, USA (xw56@stat.duke.edu); Chenlei Leng, Dept. of Statistics, University of Warwick, UK (C.Leng@warwick.ac.uk); David B. Dunson, Dept. of Statistical Science, Duke University, USA (dunson@stat.duke.edu)
Pseudocode | No | The paper does not contain any pseudocode or clearly labeled algorithm blocks.
Open Source Code | No | The paper does not provide any statement or link indicating the release of open-source code for the methodology described.
Open Datasets | No | The paper is theoretical and focuses on consistency theory under random designs (e.g., Gaussian distributions for X and ϵ). It does not use any publicly available datasets.
Dataset Splits | No | The paper is theoretical and does not present empirical experiments with training, validation, or test dataset splits.
Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for experiments.
Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies or versions.
Experiment Setup | No | The paper is theoretical and does not describe an experimental setup with hyperparameters or system-level training settings.
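
For readers unfamiliar with the two screening estimators named in the Research Type row, the following is a minimal Python sketch of marginal correlation screening (SIS) and the high-dimensional OLS projection (HOLP), written from their standard textbook definitions rather than from any code released with the paper; the function names and the toy data below are illustrative assumptions, not artifacts of the paper.

```python
import numpy as np

def sis_scores(X, y):
    """SIS-style screening: score each column of X by its absolute
    marginal correlation |X_j^T y| after centering/scaling."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    yc = y - y.mean()
    return np.abs(Xc.T @ yc)

def holp_scores(X, y):
    """HOLP-style screening: beta_hat = X^T (X X^T)^{-1} y,
    then screen by the magnitude of the entries of beta_hat."""
    G = X @ X.T                      # n x n Gram matrix; invertible w.h.p. for Gaussian designs with p > n
    beta_hat = X.T @ np.linalg.solve(G, y)
    return np.abs(beta_hat)

def screen(scores, d):
    """Keep the indices of the d largest screening scores."""
    return np.argsort(scores)[::-1][:d]

# Toy example with p >> n and a sparse signal (indices 0..4 are active).
rng = np.random.default_rng(0)
n, p, s = 100, 1000, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 3.0
y = X @ beta + rng.standard_normal(n)

print("SIS  top-10:", sorted(screen(sis_scores(X, y), 10)))
print("HOLP top-10:", sorted(screen(holp_scores(X, y), 10)))
```

Both estimators reduce screening to ranking a score vector over the p candidate variables; the paper's strong screening consistency results concern whether such rankings place every truly relevant variable above all irrelevant ones with high probability.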