Partial Hard Thresholding: Towards A Principled Analysis of Support Recovery

Authors: Jie Shen, Ping Li

NeurIPS 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on the simulated data complement our theoretical findings and also illustrate the effectiveness of PHT.
Researcher Affiliation | Academia | Jie Shen, Department of Computer Science, School of Arts and Sciences, Rutgers University, New Jersey, USA, js2007@rutgers.edu; Ping Li, Department of Statistics and Biostatistics, Department of Computer Science, Rutgers University, New Jersey, USA, pingli@stat.rutgers.edu
Pseudocode | No | The paper describes the PHT(r) algorithm using mathematical equations (z^t = x^{t-1} − η∇F(x^{t-1}); y^t = PHT_k(z^t; S^{t-1}, r); S^t = supp(y^t); x^t = argmin_{x ∈ R^d} F(x) s.t. supp(x) ⊆ S^t) but does not present them in a clearly labeled pseudocode block or algorithm environment. (A sketch of these updates appears after this table.)
Open Source Code | No | The paper does not contain any explicit statement about releasing source code or provide a link to a code repository.
Open Datasets | No | We consider the compressed sensing model y = A x̄ + 0.01 e, where the dimension d = 200 and the entries of A and e are i.i.d. normal variables. Given a sparsity level s, we first uniformly choose the support of x̄, and assign values to the non-zeros with i.i.d. normals. (The data are simulated rather than drawn from a public dataset; see the simulation sketch after this table.)
Dataset Splits | No | The paper describes the generation of simulated data for experiments but does not mention specific train/validation/test dataset splits from a pre-existing or publicly available dataset.
Hardware Specification | No | The paper does not provide any specific details about the hardware used to run the experiments.
Software Dependencies | No | The paper does not mention any specific software dependencies or their version numbers used for the experiments.
Experiment Setup | Yes | The step size η is fixed to be the unit, though one can tune it using cross-validation for better performance.
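
To make the update equations quoted in the Pseudocode row concrete, here is a minimal Python sketch of a PHT(r)-style iteration. It assumes a least-squares objective F(x) = 0.5 · ||Ax − y||^2, so ∇F(x) = Aᵀ(Ax − y), and it implements PHT_k(·; S, r) under one common reading of the operator (at most r indices from outside the previous support may enter, then the k largest-magnitude candidates are kept). The function names, the stopping rule, and this reading of the operator are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def pht_operator(z, S_prev, k, r):
    # One reading of PHT_k(z; S_prev, r): at most r indices outside the
    # previous support S_prev may enter; among S_prev plus those entrants,
    # keep the k entries of z with the largest magnitude.
    d = z.shape[0]
    outside = np.setdiff1d(np.arange(d), S_prev)
    entering = outside[np.argsort(-np.abs(z[outside]))[:r]]
    candidates = np.union1d(S_prev, entering).astype(int)
    new_support = candidates[np.argsort(-np.abs(z[candidates]))[:k]]
    return np.sort(new_support)

def pht(A, y, k, r, eta=1.0, max_iter=100):
    # Sketch of the PHT(r) iteration for F(x) = 0.5 * ||Ax - y||^2,
    # so grad F(x) = A^T (Ax - y). eta = 1.0 mirrors the unit step
    # size quoted in the Experiment Setup row.
    n, d = A.shape
    x = np.zeros(d)
    S = np.array([], dtype=int)
    for _ in range(max_iter):
        z = x - eta * A.T @ (A @ x - y)        # gradient step
        S_new = pht_operator(z, S, k, r)       # partial hard thresholding
        x = np.zeros(d)
        # Debias: least squares restricted to the new support.
        x[S_new] = np.linalg.lstsq(A[:, S_new], y, rcond=None)[0]
        if np.array_equal(S_new, S):           # support has stabilized
            break
        S = S_new
    return x, S
```

Under this reading, setting r = k gives a hard-thresholding-pursuit-style update, while small r changes only a few support elements per iteration.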
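
The Open Datasets and Experiment Setup rows describe how the simulated compressed-sensing data are generated (d = 200, i.i.d. normal A and e, noise scaled by 0.01, a uniformly chosen support with i.i.d. normal non-zeros) and that the step size is fixed at η = 1. The sketch below generates such an instance and runs the pht sketch above on it; the number of measurements n, the sparsity s, and the choice k = s are illustrative placeholders not specified in the quoted text.

```python
import numpy as np

def simulate_instance(d=200, n=100, s=10, noise_scale=0.01, seed=0):
    # y = A x_true + 0.01 e with i.i.d. normal A and e, a uniformly
    # chosen support of size s, and i.i.d. normal non-zero values.
    # d = 200 and the 0.01 noise scale come from the quoted text;
    # n and s are illustrative placeholders.
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((n, d))
    e = rng.standard_normal(n)
    x_true = np.zeros(d)
    support = np.sort(rng.choice(d, size=s, replace=False))
    x_true[support] = rng.standard_normal(s)
    y = A @ x_true + noise_scale * e
    return A, y, x_true, support

# Example run with the unit step size from the Experiment Setup row.
A, y, x_true, true_support = simulate_instance()
x_hat, recovered_support = pht(A, y, k=10, r=2, eta=1.0)
print("support recovered:", np.array_equal(recovered_support, true_support))
```

Comparing the recovered support with the true one mirrors the kind of support-recovery check the paper's simulated experiments are built around.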