A Provable Approach for Double-Sparse Coding

Authors: Thanh Nguyen, Raymond Wong, Chinmay Hegde

AAAI 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we support our analysis via several numerical experiments on simulated data, confirming that our method can indeed be useful in problem sizes encountered in practical applications."
Researcher Affiliation | Academia | Thanh V. Nguyen, ECE Department, Iowa State University (thanhng@iastate.edu); Raymond K. W. Wong, Statistics Department, Texas A&M University (raywong@tamu.edu); Chinmay Hegde, ECE Department, Iowa State University (chinmay@iastate.edu)
Pseudocode | Yes | Algorithm 1: Truncated Pairwise Reweighting; Algorithm 2: Double-Sparse Coding Descent Algorithm (a generic, hedged sketch of such a descent step is given after the table)
Open Source Code | Yes | "Matlab implementation of our algorithms is available online: https://github.com/thanh-isu/double-sparse-coding"
Open Datasets | No | "We generate a synthetic training dataset according to the model described in the Setup. The base dictionary Φ is the identity matrix of size n = 64 and the square synthesis matrix A is a block-diagonal matrix with 32 blocks. Each 2x2 block is of the form [1 1; 1 1] (i.e., the column sparsity r = 2). The support of x is drawn uniformly over all 6-dimensional subsets of [m], and the nonzero coefficients are randomly set to ±1 with equal probability." (see the data-generation sketch after the table)
Dataset Splits | No | The paper mentions "disjoint sets P1 and P2 of sizes p1 and p2 respectively" for the initialization stage, but it does not specify explicit training/validation/test splits for the overall experimental evaluation of the model's performance.
Hardware Specification | No | The paper does not state the hardware (e.g., GPU/CPU models, memory) used to run the experiments.
Software Dependencies | No | The paper mentions a "Matlab implementation of our algorithms" and refers to "the implementation of Trainlets", but it does not provide version numbers for Matlab or any other software libraries/dependencies.
Experiment Setup | Yes | "For all the approaches except Trainlets, we use T = 2000 iterations for the initialization procedure, and set the number of steps in the descent stage to 25. ... The learning step of Trainlets is executed for 50 iterations." (the driver sketch after the table shows how these counts could fit together)
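
To make the synthetic setup quoted under Open Datasets concrete, the following Python/NumPy sketch generates data from that model. It is written for this report and is not the authors' MATLAB code; the function name generate_synthetic and its num_samples and seed arguments are invented here, and the ±1 sign convention for the nonzero coefficients follows the quoted description.

    import numpy as np

    def generate_synthetic(num_samples, n=64, r=2, k=6, seed=0):
        # Illustrative sketch of the described model (not the authors' MATLAB code).
        # Base dictionary Phi: n x n identity. Synthesis matrix A: block diagonal
        # with n/r all-ones blocks of size r x r, so every column has exactly r nonzeros.
        rng = np.random.default_rng(seed)
        Phi = np.eye(n)
        A = np.kron(np.eye(n // r), np.ones((r, r)))         # 32 blocks of [1 1; 1 1] when n = 64, r = 2
        m = A.shape[1]
        X = np.zeros((m, num_samples))
        for j in range(num_samples):
            support = rng.choice(m, size=k, replace=False)   # uniform k-sparse support (k = 6)
            X[support, j] = rng.choice([-1.0, 1.0], size=k)  # +/-1 nonzeros with equal probability (assumed convention)
        Y = Phi @ (A @ X)                                    # one training sample per column of Y
        return Y, Phi, A, X

Because Φ is the identity here, the effective dictionary D = ΦA equals A, and the block structure gives every column the quoted column sparsity r = 2.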
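
Algorithm 2 (Double-Sparse Coding Descent Algorithm) is only named above, so the next sketch shows a generic alternating update of the same flavor: hard-threshold sparse coding against D = ΦA, a gradient step on A, and re-projection of each column of A to r nonzeros. This is a hedged illustration of the double-sparsity idea, not the paper's exact update rule; descent_step, hard_threshold_cols, and the step size eta are choices made here.

    import numpy as np

    def hard_threshold_cols(M, s):
        # Keep the s largest-magnitude entries in each column of M; zero out the rest.
        out = np.zeros_like(M)
        idx = np.argsort(-np.abs(M), axis=0)[:s, :]
        cols = np.arange(M.shape[1])
        out[idx, cols] = M[idx, cols]
        return out

    def descent_step(Y, Phi, A, k, r, eta=0.2):
        # Generic alternating update (illustration only, not the paper's exact Algorithm 2):
        # 1) sparse-code each sample by hard-thresholding D^T y to its k largest entries,
        # 2) take a gradient step on A for the average squared reconstruction error,
        # 3) re-project each column of A to r nonzeros (the double-sparsity constraint).
        D = Phi @ A
        X = hard_threshold_cols(D.T @ Y, k)               # k-sparse code estimates, one per sample
        residual = Phi @ (A @ X) - Y
        grad = Phi.T @ residual @ X.T / Y.shape[1]        # d/dA of 0.5*||Phi A X - Y||_F^2 / N
        return hard_threshold_cols(A - eta * grad, r)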
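
Finally, the iteration counts quoted under Experiment Setup could plug into a driver along the following lines, reusing the two sketches above. The initialization is a placeholder (a perturbed copy of the true A) standing in for Algorithm 1's quoted 2000-iteration procedure, which is not reproduced here; the 25 descent steps mirror the quoted setting.

    import numpy as np

    rng = np.random.default_rng(1)
    Y, Phi, A_true, _ = generate_synthetic(num_samples=5000)

    # Placeholder for Algorithm 1 (Truncated Pairwise Reweighting, quoted as T = 2000 iterations):
    # start from a column-sparse perturbation of the true dictionary instead.
    A_hat = hard_threshold_cols(A_true + 0.1 * rng.standard_normal(A_true.shape), 2)

    for _ in range(25):                                   # descent stage: 25 steps, as quoted
        A_hat = descent_step(Y, Phi, A_hat, k=6, r=2)

    print(np.linalg.norm(A_hat - A_true) / np.linalg.norm(A_true))   # relative dictionary error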