Approximate message passing for amplitude based optimization

Authors: Junjie Ma, Ji Xu, Arian Maleki

ICML 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We now provide simulation results to verify our analysis and compare AMP.A in (6) with existing algorithms.
Researcher Affiliation | Academia | 1: Department of Statistics, Columbia University, New York, USA; 2: Department of Computer Science, Columbia University, New York, USA; 3: Department of Statistics, Columbia University, New York, USA.
Pseudocode | Yes | Starting from an initial estimate x^0 ∈ C^{n×1}, AMP.A proceeds as follows for t ≥ 0: p^t = A x^t − (λ_{t−1}/δ) g(p^{t−1}, y) and x^{t+1} = λ_t x^t + A^H g(p^t, y), where the scalar λ_t is updated by a recursion involving the divergence terms div_p(g^t) and div_p(g^{t−1}), the scalars τ_t, τ_{t−1}, and µ, the ratio 1/δ, and λ_{t−1}. (A hedged code sketch of this iteration appears after the table.)
Open Source Code | No | No explicit statement or link indicating that the source code for the methodology described in this paper is publicly available.
Open Datasets | No | The true signal is generated as x ~ CN(0, I). The nonnegative signal is generated in the following way: we set 90% of the entries to be zero and the remaining 10% to be constants. (A signal-generation sketch appears after the table.)
Dataset Splits | No | The paper does not explicitly provide training/test/validation dataset splits. It uses synthetically generated data without specifying such splits.
Hardware Specification | No | No specific hardware details (e.g., GPU/CPU models, memory, or cloud instance types) used for running the experiments are mentioned.
Software Dependencies | No | No specific software components with version numbers (e.g., libraries, frameworks, or programming language versions) are provided.
Experiment Setup | Yes | In these experiments, n = 5000 and m = 20000. We fix n = 1000 and vary δ. All algorithms are run for 1000 iterations. Reconstruction is considered successful if the final AMSE is smaller than 10^{-10}. The success rates are measured over 100 independent realizations of A and x. The signal-to-noise ratio (SNR) is defined to be E[||Ax||^2]/E[||w||^2]. (A simulation-setup sketch follows the table.)
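
To make the Pseudocode row concrete, below is a minimal Python sketch of the two quoted recursions, p^t = A x^t − (λ_{t−1}/δ) g(p^{t−1}, y) and x^{t+1} = λ_t x^t + A^H g(p^t, y). It is not the authors' implementation: the names `g` and `amp_a_sketch` are hypothetical, `g` is written as one common form of the amplitude-loss residual, and the paper's λ_t recursion (which involves divergence terms, τ_t, τ_{t−1}, and µ) is replaced by a fixed scalar for illustration only.

```python
import numpy as np

def g(p, y):
    # Assumed amplitude-loss residual, proportional to the negative gradient of
    # 0.5 * (y - |p|)**2 with respect to p; the paper's exact convention may differ.
    return 0.5 * (y * p / np.abs(p) - p)

def amp_a_sketch(A, y, x0, n_iter=1000, lam=1.0):
    """Skeleton of the two quoted AMP.A recursions.

    `lam` stands in for the paper's lambda_t, which is updated through a
    recursion involving divergence terms of g, tau_t, tau_{t-1}, and mu; that
    recursion is not reproduced here, so a fixed value is used for illustration.
    x0 should be a nonzero (e.g., random or spectral) initialization.
    """
    m, n = A.shape
    delta = m / n                  # measurement ratio delta = m/n
    x = x0.astype(complex)
    p_prev = None
    for _ in range(n_iter):
        if p_prev is None:
            p = A @ x              # first iteration: no correction term yet
        else:
            # p^t = A x^t - (lambda_{t-1} / delta) * g(p^{t-1}, y)
            p = A @ x - (lam / delta) * g(p_prev, y)
        # x^{t+1} = lambda_t x^t + A^H g(p^t, y)
        x = lam * x + A.conj().T @ g(p, y)
        p_prev = p
    return x
```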
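
The two signal models quoted in the Open Datasets row can be generated as in the sketch below. The helper names are hypothetical, and details the quotes leave open, such as the value of the nonzero constants in the nonnegative signal and the exact normalization of the complex Gaussian entries, are filled in with assumed choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_signal(n):
    # True signal x ~ CN(0, I): i.i.d. circularly-symmetric complex Gaussian
    # entries with unit variance.
    return (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def nonnegative_signal(n, zero_fraction=0.9, value=1.0):
    # Nonnegative signal as quoted: 90% of the entries are zero and the
    # remaining 10% are constants; the constant value is not given, so 1.0 is assumed.
    x = np.zeros(n)
    k = int(round((1.0 - zero_fraction) * n))
    support = rng.choice(n, size=k, replace=False)
    x[support] = value
    return x
```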
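
Finally, a sketch of the quoted evaluation protocol: measurements at a target SNR = E[||Ax||^2]/E[||w||^2], reconstruction declared successful when the final AMSE is below 10^{-10}, and success rates averaged over 100 independent realizations. The i.i.d. complex Gaussian measurement matrix, the noisy amplitude model y = |Ax| + w, and the phase-invariant AMSE metric are standard assumptions for this setting rather than details stated in the quotes, and the function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def measurements(x_true, delta, snr=np.inf):
    # Measurement matrix with i.i.d. CN(0, 1/m) entries (assumed normalization)
    # and noisy amplitude measurements y = |A x| + w.
    n = x_true.shape[0]
    m = int(round(delta * n))
    A = (rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))) / np.sqrt(2 * m)
    z = A @ x_true
    if np.isinf(snr):
        w = np.zeros(m)
    else:
        # Noise power chosen so that SNR = E[||A x||^2] / E[||w||^2].
        noise_var = np.mean(np.abs(z) ** 2) / snr
        w = np.sqrt(noise_var) * rng.standard_normal(m)
    return A, np.abs(z) + w

def amse(x_hat, x_true):
    # Mean-squared error up to a global phase (the usual phase-retrieval
    # convention; the exact metric is not stated in the quotes).
    c = np.exp(1j * np.angle(np.vdot(x_hat, x_true)))
    return np.mean(np.abs(c * x_hat - x_true) ** 2)

def success_rate(estimator, n=1000, delta=4.0, trials=100, tol=1e-10):
    # `estimator` is any reconstruction routine taking (A, y) and returning an
    # estimate of x. A trial succeeds if the final AMSE is below 1e-10; rates
    # are averaged over 100 independent realizations of A and x, per the quotes.
    wins = 0
    for _ in range(trials):
        x_true = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
        A, y = measurements(x_true, delta)
        wins += amse(estimator(A, y), x_true) < tol
    return wins / trials
```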