One-Step Estimator for Permuted Sparse Recovery

Authors: Hang Zhang, Ping Li

ICML 2023

Reproducibility Variable Result LLM Response
Research Type Experimental Numerical experiments are presented to corroborate our theoretical claims. This section presents the numerical experiments to verify our main theorem, to put it more specifically, Theorem 3: we would like to prove that the correct permutation can be obtained, i.e., {Π_opt = Π♮}, with n ≲ p and SNR ≥ c.
Researcher Affiliation Industry Hang Zhang, Amazon (410 Terry Ave N, Seattle, WA 98109, USA), hagzhang@amazon.com; Ping Li, LinkedIn Ads (700 Bellevue Way NE, Bellevue, WA 98004, USA), pinli@linkedin.com
Pseudocode Yes Algorithm 1 One-step estimator. Input: observation Y and sensing matrix X. Output: pair (Π_opt, B_opt), which is written as Π_opt = argmax_{Π ∈ P_n} ⟨Π, Y · thres(X⊤Y)⊤ X⊤⟩, B_opt = argmin_B (2n)⁻¹ ‖Π_opt⊤ Y − XB‖²_F + λ_n ‖B‖₁, where thres(·) applies to each column and thresholds all entries to zero except the one with the largest magnitude, P_n denotes the set of all possible permutation matrices, ‖·‖₁ ≜ Σ_{i,j} |(·)_{i,j}| denotes the absolute sum of all entries, and λ_n > 0 is some regularizer coefficient.
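A minimal sketch of the two steps of Algorithm 1: the permutation step is a linear assignment problem over ⟨Π, M⟩ (solvable with `scipy.optimize.linear_sum_assignment`), and the lasso step is approximated here with a simple proximal-gradient (ISTA) loop, since the paper does not specify a solver. Function names and the iteration count are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def thres(M):
    """Zero every entry of each column except the one with largest magnitude."""
    out = np.zeros_like(M)
    idx = np.abs(M).argmax(axis=0)
    cols = np.arange(M.shape[1])
    out[idx, cols] = M[idx, cols]
    return out

def one_step_estimator(Y, X, lam, n_iter=500):
    """Sketch of Algorithm 1: permutation step, then a lasso step via ISTA."""
    n = Y.shape[0]
    # Step 1: Pi_opt = argmax_{Pi in P_n} <Pi, Y thres(X^T Y)^T X^T>,
    # a linear assignment problem on the (n x n) score matrix M.
    M = Y @ thres(X.T @ Y).T @ X.T
    rows, cols = linear_sum_assignment(M, maximize=True)
    Pi = np.zeros((n, n))
    Pi[rows, cols] = 1.0
    # Step 2: B_opt = argmin_B (2n)^{-1} ||Pi_opt^T Y - X B||_F^2 + lam ||B||_1,
    # solved here by proximal gradient descent with soft-thresholding.
    Z = Pi.T @ Y
    B = np.zeros((X.shape[1], Y.shape[1]))
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        G = X.T @ (X @ B - Z) / n                 # gradient of the smooth part
        V = B - step * G
        B = np.sign(V) * np.maximum(np.abs(V) - step * lam, 0.0)
    return Pi, B
```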
Open Source Code No The paper does not provide any explicit statements about releasing source code or links to a code repository for the methodology described.
Open Datasets Yes This subsection evaluates our algorithm on the real-world dataset, namely, MNIST dataset (LeCun et al., 1998).
Dataset Splits No The paper mentions using 'training' and 'test' sets from the MNIST dataset but does not provide specific details on the dataset splits (e.g., percentages, sample counts, or explicit standard split references) beyond referring to pre-existing MNIST sets.
Hardware Specification No The paper does not provide any specific hardware details such as GPU or CPU models, memory, or computational resources used for running the experiments.
Software Dependencies No The paper does not specify any software dependencies with version numbers, such as programming languages, libraries, or frameworks used for implementation.
Experiment Setup Yes We let X_ij ∼ i.i.d. N(0, 1) and pick the sample number n to be {100, 150} and set h = n/4. We vary the signal length p to be {500, 600}. Then we set the sparsity number k within the region {10, 15, 20}. And the stable rank srank(B♮) is within the range {150, 200, 250}. Setting λ_n in (3) as c₂σ√(log p / n)
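As an illustration, the synthetic setup quoted above can be generated as follows. The column count m of B♮, the choice c₂ = 1, and the support/value distribution of B♮'s entries are assumptions not fixed by the quoted text; everything else (Gaussian X, h = n/4 displaced rows, k-sparse columns, λ_n = c₂σ√(log p / n)) follows the stated setup.

```python
import numpy as np

def make_instance(n=100, p=500, k=10, m=5, sigma=1.0, c2=1.0, seed=0):
    """Build one synthetic permuted sparse-recovery instance.

    X_ij ~ i.i.d. N(0, 1); at most h = n/4 rows of Y are permuted;
    each column of B is k-sparse; lambda_n = c2 * sigma * sqrt(log p / n).
    (m, c2, and B's entry distribution are illustrative assumptions.)
    """
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, p))                  # X_ij ~ i.i.d. N(0, 1)
    B = np.zeros((p, m))
    for j in range(m):                               # each column is k-sparse
        support = rng.choice(p, size=k, replace=False)
        B[support, j] = rng.standard_normal(k)
    perm = np.arange(n)                              # displace at most h = n/4 rows
    moved = rng.choice(n, size=n // 4, replace=False)
    perm[moved] = rng.permutation(moved)
    Y = (X @ B)[perm] + sigma * rng.standard_normal((n, m))
    lam = c2 * sigma * np.sqrt(np.log(p) / n)        # regularizer from the paper
    return X, Y, B, perm, lam
```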