Block Sparse Bayesian Learning: A Diversified Scheme

Authors: Yanhao Zhang, Zhihan Zhu, Yong Xia

NeurIPS 2024

Reproducibility Variable Result LLM Response
Research Type Experimental Experiments validate the advantages of DivSBL over existing algorithms. In this section, we compare DivSBL with the following six algorithms: 1. Block-based algorithms: (1) BSBL, (2) Group Lasso, (3) Group BPDN. 2. Pattern-based algorithms: (4) PC-SBL, (5) StructOMP. 3. Sparse learning (without structural information): (6) SBL. Results are averaged over 100 or 500 random runs (based on computational scale), with SNR ranging from 15-25 dB except the test for varied noise levels. Normalized Mean Squared Error (NMSE), defined as ||x̂ − x_true||_2^2 / ||x_true||_2^2, and Correlation (Corr) (cosine similarity) are used to compare algorithms.
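The two evaluation metrics quoted above are simple to state in code. The following NumPy sketch is illustrative only (the paper's experiments are in MATLAB, and this is not the authors' code); it implements the NMSE and cosine-similarity definitions given in the quote:

```python
import numpy as np

def nmse(x_hat, x_true):
    """Normalized Mean Squared Error: ||x_hat - x_true||_2^2 / ||x_true||_2^2."""
    return np.sum((x_hat - x_true) ** 2) / np.sum(x_true ** 2)

def corr(x_hat, x_true):
    """Correlation (cosine similarity) between recovered and true signals."""
    return np.dot(x_hat, x_true) / (np.linalg.norm(x_hat) * np.linalg.norm(x_true))

# Toy example: a near-perfect recovery gives NMSE near 0 and Corr near 1.
x_true = np.array([1.0, 0.0, 2.0, 0.0])
x_hat = np.array([1.1, 0.0, 1.9, 0.1])
print(nmse(x_hat, x_true), corr(x_hat, x_true))
```

Lower NMSE and higher Corr indicate better recovery; both are then averaged over the 100 or 500 random runs mentioned above.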
Researcher Affiliation Academia Yanhao Zhang, Zhihan Zhu, Yong Xia; School of Mathematical Sciences, Beihang University, Beijing 100191; {yanhaozhang, zhihanzhu, yxia}@buaa.edu.cn
Pseudocode Yes In conclusion, the Diversified SBL (DivSBL) algorithm is summarized as Algorithm 1 below. The procedure, using the dual ascent method to diversify B_i, is summarized in Algorithm 2 as follows:
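The quote names dual ascent as the tool behind Algorithm 2. The paper's actual update for the covariances B_i is not reproduced here; as a generic reminder of the dual-ascent template it invokes, here is a toy NumPy sketch for the equality-constrained problem min 0.5*||x − c||^2 s.t. a^T x = b (problem, variable names, and step size are all illustrative assumptions, not the paper's Algorithm 2):

```python
import numpy as np

# Toy dual ascent: alternate a primal minimization of the Lagrangian
# with a gradient ascent step on the multiplier lam.
c = np.array([1.0, 2.0, 3.0])   # point to project
a = np.array([1.0, 1.0, 1.0])   # constraint normal
b = 3.0                         # constraint value: a^T x = b
lam, step = 0.0, 0.2            # multiplier and dual step size

for _ in range(200):
    x = c - lam * a             # primal step: argmin_x 0.5||x-c||^2 + lam*(a^T x - b)
    lam += step * (a @ x - b)   # dual step: ascend on the constraint residual

# At convergence x is the projection of c onto {a^T x = b}.
```

The same template (primal update, then multiplier ascent on the constraint residual) is what Algorithm 2 specializes to the B_i diversification step.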
Open Source Code Yes Matlab codes for our algorithm are available at https://github.com/YanhaoZhang1/DivSBL.
Open Datasets Yes We initially test on synthetic signal data, including homoscedastic (provided by [24]) and heteroscedastic data... randomly chosen in AudioSet [34]... In 2D image experiments, we utilize a standard set of grayscale images compiled from two sources. Available at http://dsp.rice.edu/software/DAMP-toolbox and http://see.xidian.edu.cn/faculty/wsdong/NLR_Exps.htm
Dataset Splits No The paper does not explicitly provide details about a validation dataset split (e.g., percentages, sample counts, or methodology for a dedicated validation set).
Hardware Specification No The paper mentions 'CPU time' in Appendix D but does not provide specific hardware details such as GPU/CPU models, processor types, or memory amounts used for running experiments.
Software Dependencies No Matlab codes for our algorithm are available at https://github.com/YanhaoZhang1/DivSBL. While this implies the use of Matlab, the paper does not specify version numbers for Matlab or any other key software components used in the experiments.
Experiment Setup Yes Results are averaged over 100 or 500 random runs (based on computational scale), with SNR ranging from 15-25 dB except the test for varied noise levels. Sensitivity to initialization is tested on the heteroscedastic signal from Section 5.1. Initial variances are set to γ = η·ones(gL, 1) and γ = η·rand(gL, 1), with the scale parameter η ranging from 1×10^(-1) to 1×10^4. Input: Measurement matrix Φ, response y, initialized variance γ, prior covariance Σ0, noise variance β, and multipliers λ0.
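The two initialization schemes quoted above (uniform η·ones(gL, 1) and random η·rand(gL, 1)) translate directly from MATLAB. The sketch below is an illustrative NumPy rendering, not the authors' code; the value of gL (the number of variance parameters) is a placeholder:

```python
import numpy as np

rng = np.random.default_rng(0)
gL = 64  # number of variance parameters; illustrative value, not from the paper

def init_gamma(eta, scheme="ones", rng=rng):
    """Initialize the prior variances gamma at scale eta, mirroring the paper's
    gamma = eta*ones(gL,1) (uniform) and gamma = eta*rand(gL,1) (random) settings."""
    if scheme == "ones":
        return eta * np.ones(gL)
    if scheme == "rand":
        return eta * rng.random(gL)  # uniform draws in [0, eta)
    raise ValueError(f"unknown scheme: {scheme}")

# Sweep the scale eta over the range tested in the paper, 1e-1 to 1e4.
for eta in np.logspace(-1, 4, 6):
    gamma = init_gamma(eta, "rand")
```

Sweeping η across five orders of magnitude is what the quoted sensitivity test varies while holding the rest of the inputs (Φ, y, Σ0, β, λ0) fixed.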