An Iterative Min-Min Optimization Method for Sparse Bayesian Learning

Authors: Yasen Wang, Junlin Li, Zuogong Yue, Ye Yuan

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We demonstrate the proposed novel SBL algorithm on simulation and real-world problems to validate its capability for sparse recovery from overcomplete dictionaries, including sparse signal recovery, system identification, and sparse kernel regression. Benchmarked against classical SBL algorithms, experimental results illustrate its superior performance in finding sparse solutions."
Researcher Affiliation | Academia | "1 School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan, China; 2 State Key Lab of Digital Manufacturing Equipment and Technology, Huazhong University of Science and Technology, Wuhan, China; 3 School of Mathematics and Statistics, Fuyang Normal University, Fuyang, China; 4 School of Artificial Intelligence and Automation, Huazhong University of Science and Technology, Wuhan, China."
Pseudocode | Yes | "Algorithm 1: Iterative Min-Min optimization method" (a baseline SBL sketch follows the table).
Open Source Code | Yes | "Source codes are available on GitHub at https://github.com/ArthinYS/MinMinSBL."
Open Datasets | Yes | "Finally, we apply all the SBL algorithms to discover the chaotic Lorenz system from the overcomplete dictionary Φ ∈ ℝ^{65×95}. ... Finally, we apply all the SBL algorithms to kernel regression models on the Red Wine Quality dataset, which contains the information concerning wine quality."
Dataset Splits | Yes | "In the experiment, we split 1599 data points into 1000 for training and 599 for testing."
Hardware Specification | Yes | "The experiments are conducted using MATLAB 2022b on a PC with an Apple M1 Pro chip with 10-core CPU and 32 GB RAM."
Software Dependencies | Yes | "The experiments are conducted using MATLAB 2022b on a PC with an Apple M1 Pro chip with 10-core CPU and 32 GB RAM."
Experiment Setup | Yes | "In the experiments, we set n = 60, m = 100, and k = 4, and conduct simulation trials with SNR values ranging from 0 to 35 dB in steps of 5 dB. In all cases, we run 100 independent trials to test the performance of all the SBL algorithms. Additionally, a successful trial is recorded if the indices of nonzero elements in the estimated vector w are the same as the true indices. ... We use the MATLAB function ode45 to solve the system with the initial condition [x, y, z]ᵀ = [−8, 7, 27]ᵀ and obtain data with a time step of 0.001 over the time interval [0, 65]. Then, we uniformly sub-sample 65 data points from the collected data. ... In the experiment, we split 1599 data points into 1000 for training and 599 for testing. The basis functions include kernel functions and a constant term. Particularly, we consider four different kernel functions: linear, Matérn-3/2, exponential, and Gaussian kernels. The hyperparameters in the kernels are set to 1 by default." (Sketches of these three setups follow the table.)
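
For orientation on the setting of Algorithm 1, the following is a minimal MATLAB sketch of a classical SBL iteration (Tipping-style EM hyperparameter updates), the baseline family the paper benchmarks against. It is not the paper's min-min iteration; the function name, the known-noise-variance assumption, and the fixed iteration count are ours.

```matlab
function mu = sbl_em(Phi, y, sigma2, nIter)
% Classical SBL via EM hyperparameter updates (Tipping-style) -- a
% baseline sketch, NOT the paper's min-min Algorithm 1. The noise
% variance sigma2 is assumed known here for simplicity.
[n, ~] = size(Phi);
gamma = ones(size(Phi, 2), 1);             % prior variances of w
for it = 1:nIter
    G     = diag(gamma);
    C     = sigma2*eye(n) + Phi*G*Phi';    % marginal covariance of y
    Sigma = G - G*Phi' * (C \ (Phi*G));    % posterior covariance of w
    mu    = (Sigma * (Phi' * y)) / sigma2; % posterior mean of w
    gamma = mu.^2 + diag(Sigma);           % EM update; small gamma -> pruned
end
end
```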
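The sparse-signal-recovery setup (n = 60, m = 100, k = 4, SNR swept in dB, support-match success criterion) can be reproduced along these lines. The Gaussian dictionary with unit-norm columns and the Gaussian nonzero coefficients are our assumptions, since the quoted excerpt fixes only the dimensions, the sparsity level, and the SNR grid.

```matlab
% One synthetic trial at a given SNR; loop over snr_dB = 0:5:35 and
% 100 trials to reproduce the reported sweep. Dictionary and
% coefficient distributions are assumptions, not from the paper.
n = 60; m = 100; k = 4; snr_dB = 20;
Phi = randn(n, m);
Phi = Phi ./ vecnorm(Phi);                % unit-norm columns (assumed)
idx = randperm(m, k);                     % random true support
w_true = zeros(m, 1);
w_true(idx) = randn(k, 1);
y0 = Phi * w_true;
e  = randn(n, 1);
e  = e * norm(y0) / (norm(e) * 10^(snr_dB/20));  % scale noise to target SNR
y  = y0 + e;
% Success criterion from the paper: recovered support equals the true one.
w_hat = sbl_em(Phi, y, var(e), 200);      % baseline sketch from above
[~, top] = maxk(abs(w_hat), k);
success = isequal(sort(top(:)), sort(idx(:)));
```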
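The Lorenz data collection in the Experiment Setup row maps directly onto ode45. The excerpt states only the initial condition, step size, horizon, and the 65 retained points; the chaotic-regime parameters (σ = 10, ρ = 28, β = 8/3) and the uniform sub-sampling scheme below are assumptions.

```matlab
% Lorenz trajectory: step 0.001 on [0, 65] from [-8; 7; 27], then 65
% uniformly spaced samples. sig/rho/bet are the standard chaotic-regime
% parameters, assumed rather than quoted.
sig = 10; rho = 28; bet = 8/3;
lorenz = @(t, s) [sig*(s(2) - s(1));
                  s(1)*(rho - s(3)) - s(2);
                  s(1)*s(2) - bet*s(3)];
[~, S] = ode45(lorenz, 0:0.001:65, [-8; 7; 27]);
keep = round(linspace(1, size(S, 1), 65)); % uniform sub-sampling (assumed)
X = S(keep, :);                            % 65 x 3 data matrix
```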
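For the Red Wine Quality experiment, the dictionary combines four kernels and a constant column, with all kernel hyperparameters set to 1. The kernel formulas below follow one common convention for each family; the CSV file name and format, and the random 1000/599 split, are our assumptions.

```matlab
% Kernel dictionary for the Red Wine Quality data (1599 samples).
% File name/format assume the UCI semicolon-delimited CSV.
D = readmatrix('winequality-red.csv', 'Delimiter', ';');
X = D(:, 1:end-1);  y = D(:, end);
p  = randperm(size(X, 1));
tr = p(1:1000);  te = p(1001:end);         % 1000 train / 599 test
Xt = X(tr, :);
sq = sum(Xt.^2, 2);
d  = sqrt(max(sq + sq' - 2*(Xt*Xt'), 0));  % pairwise distances, ell = 1
K_lin = Xt * Xt';                          % linear kernel
K_m32 = (1 + sqrt(3)*d) .* exp(-sqrt(3)*d);% Matern-3/2 kernel, ell = 1
K_exp = exp(-d);                           % exponential kernel, ell = 1
K_gau = exp(-d.^2 / 2);                    % Gaussian kernel, ell = 1
Phi = [ones(numel(tr), 1), K_lin, K_m32, K_exp, K_gau];  % plus constant term
```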