Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Swept Approximate Message Passing for Sparse Estimation

Authors: Andre Manoel, Florent Krzakala, Eric Tramel, Lenka Zdeborová

ICML 2015 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our results show that this change to the AMP iteration can provide expected, but hitherto unobtainable, performance for problems on which the standard AMP iteration diverges. Additionally, we find that the computational cost of this swept coefficient update scheme is not unduly burdensome, allowing it to be applied efficiently to signals of large dimensionality.
Researcher Affiliation | Academia | Andre Manoel EMAIL Institute of Physics, University of São Paulo, R. do Matão 187, São Paulo, SP 05508-090, Brazil; Florent Krzakala EMAIL Université Pierre et Marie Curie and École Normale Supérieure, 24 rue Lhomond, 75005 Paris, France; Eric W. Tramel EMAIL École Normale Supérieure, 24 rue Lhomond, 75005 Paris, France; Lenka Zdeborová EMAIL Institut de Physique Théorique, CEA Saclay, and CNRS URA 2306, 91191 Gif-sur-Yvette, France
Pseudocode | Yes | Algorithm 1: Swept AMP
Open Source Code | Yes | We have provided demonstrations of the SwAMP code on-line: https://github.com/eric-tramel/SwAMP-Demo
Open Datasets | No | The paper describes generating synthetic data for its experiments (e.g., 'we draw i.i.d. projections according to...', 'draw Φ = PQ') rather than using pre-existing publicly available datasets with concrete access information.
Dataset Splits | No | The paper does not explicitly describe training/validation/test dataset splits. It discusses algorithm iterations and convergence but not data partitioning for these purposes.
Hardware Specification | Yes | All experiments were conducted on a computer with an i7-3930K processor and run via Matlab.
Software Dependencies | No | The paper mentions 'Matlab' but does not provide a version number. While specific toolboxes such as 'SPGL1', the 'SparseReg Matlab toolbox', and the 'SpaSM Matlab toolbox' are mentioned, their version numbers are not provided in the text.
Experiment Setup | Yes | For these tests, we draw Φ = PQ, where P_{µk}, Q_{ki} ~ N(0, 1) and R = ηN. In our experiments, we use η to denote the level of independence of the rows of Φ, with lower values of η representing a more difficult problem. The tests are conducted over 500 independent realizations of the sparse reconstruction problem for N = 1024, α = 0.6, and ρ = 0.2 with a noise variance of 10^-8. For the implementation of the ℓp regression we utilized the SparseReg Matlab toolbox (Zhou, 2013), while we use the SpaSM Matlab toolbox (Sjöstrand et al., 2012) for the implementation of the adaptive Lasso. Additionally, for the adaptive Lasso we choose a weight exponent of 0.1.
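The generation recipe quoted in the Experiment Setup row (Φ = PQ with i.i.d. Gaussian factors and inner dimension R = ηN) can be sketched in a few lines. This is a minimal illustration, not the paper's code: the 1/sqrt normalization of Φ, the Bernoulli-Gaussian signal prior, and the function name `make_problem` are assumptions not stated in the excerpt above.

```python
import numpy as np

def make_problem(N=1024, alpha=0.6, rho=0.2, eta=0.5, delta=1e-8, rng=None):
    """One realization of the correlated sparse-recovery setup (sketch).

    Phi = P @ Q with i.i.d. N(0, 1) entries and inner dimension R = eta * N,
    so smaller eta means more correlated rows and a harder problem.
    The 1/sqrt(R * N) normalization is an assumption (for O(1) norms).
    """
    rng = np.random.default_rng(rng)
    M = int(alpha * N)              # number of measurements
    R = int(eta * N)                # inner rank controlling row independence
    P = rng.standard_normal((M, R))
    Q = rng.standard_normal((R, N))
    Phi = P @ Q / np.sqrt(R * N)    # assumed normalization
    # rho-sparse Bernoulli-Gaussian signal, a common choice in this setting
    support = rng.random(N) < rho
    x = np.where(support, rng.standard_normal(N), 0.0)
    y = Phi @ x + np.sqrt(delta) * rng.standard_normal(M)
    return Phi, x, y
```

With the paper's reported values (N = 1024, α = 0.6, ρ = 0.2), this yields an underdetermined system with roughly 614 measurements and about 205 active coefficients per realization.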
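The Pseudocode row above points to Algorithm 1 (Swept AMP), whose central idea is replacing AMP's parallel coefficient update with a sequential sweep over coefficients. The sketch below is not the authors' algorithm: it illustrates the general sweep pattern using plain coordinate-descent soft-thresholding for the Lasso, where the residual is kept current after each single-coefficient update rather than refreshed once per iteration.

```python
import numpy as np

def lasso_sweep(Phi, y, lam=0.1, n_sweeps=50):
    """Illustrative sequential (swept) update scheme, NOT Algorithm 1:
    standard coordinate descent for the Lasso objective
    (1/2)||y - Phi x||^2 + lam * ||x||_1."""
    M, N = Phi.shape
    x = np.zeros(N)
    r = y.copy()                      # residual y - Phi @ x, kept in sync
    col_sq = (Phi ** 2).sum(axis=0)   # per-column squared norms
    for _ in range(n_sweeps):
        for i in range(N):            # the "sweep": one coefficient at a time
            if col_sq[i] == 0.0:
                continue
            # correlation with residual, excluding coefficient i's own term
            rho_i = Phi[:, i] @ r + col_sq[i] * x[i]
            x_new = np.sign(rho_i) * max(abs(rho_i) - lam, 0.0) / col_sq[i]
            r += Phi[:, i] * (x[i] - x_new)   # update residual immediately
            x[i] = x_new
    return x
```

The immediate residual update after each coefficient is what distinguishes a swept scheme from a parallel one; it is also what the paper credits for improved convergence behavior on difficult matrices, at a modest computational cost.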