SiBBlInGS: Similarity-driven Building-Block Inference using Graphs across States

Authors: Noga Mudrik, Gal Mishne, Adam Shabti Charles

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate SiBBlInGS's ability to reveal insights into complex phenomena as well as its robustness to noise and missing samples through several synthetic and real-world examples, including web search and neural data.
Researcher Affiliation | Academia | 1) Biomedical Engineering, Kavli NDI, Center for Imaging Science, The Mathematical Institute for Data Science, The Johns Hopkins University, Baltimore, MD, USA; 2) Halıcıoğlu Data Science Institute, UCSD, San Diego, CA, USA.
Pseudocode | Yes | Algorithm 1: The SiBBlInGS Model (concise version)
Open Source Code | Yes | The code employed in this study is available at https://github.com/NogaMudrik/SiBBlInGS.
Open Datasets | Yes | The data used in this study are publicly available and cited within the paper (e.g., the neural activity recordings from Chowdhury & Miller (2022)).
Dataset Splits | Yes | A k-fold cross-validation classification approach with k = 4 folds was used in a multi-class logistic regression model with multinomial loss (trained on 3 folds and used to predict the labels of the remaining fold).
Hardware Specification | No | The paper states, 'All experiments and code were developed and executed using Python version 3.10.4 and are compatible with standard desktop machines,' but does not provide specific hardware details such as GPU/CPU models, memory, or cloud resources.
Software Dependencies | No | The paper mentions 'Python version 3.10.4' but does not provide version numbers for the other key software components it relies on, such as scikit-learn, PyLops, SPGL1, or SciPy.
Experiment Setup | Yes | The parameters for the λ update in Eq. (2) were ε = 0.01, β = 0.09, and w_graph = 1. For the regularization of Φ (Eq. (3)), the parameters used were γ1 = 0.1, γ2 = 0.1, γ3 = 0, and γ4 = 0.0001.
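The evaluation protocol quoted in the Dataset Splits row can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the features and labels below are random placeholders, and only the protocol (4-fold cross-validation of a multi-class logistic regression with multinomial loss, training on 3 folds and predicting the held-out fold) follows the paper's description.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

# Placeholder data standing in for the paper's extracted features/labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))    # 120 samples, 10 features
y = rng.integers(0, 3, size=120)  # 3 hypothetical classes

# k = 4 folds, as stated in the paper.
cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)

accuracies = []
for train_idx, test_idx in cv.split(X, y):
    # Multinomial loss is the default multi-class behavior of the
    # lbfgs solver; train on 3 folds, score on the remaining fold.
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X[train_idx], y[train_idx])
    accuracies.append(clf.score(X[test_idx], y[test_idx]))

mean_acc = float(np.mean(accuracies))
```

With real class-separable features in place of the random placeholders, `mean_acc` gives the cross-validated classification accuracy reported by such a protocol.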
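The Experiment Setup row's reported values can be collected into a single configuration object for reference. The dictionary keys below are hypothetical names chosen for readability; the paper does not specify variable names, only the symbols and values.

```python
# Hypothetical configuration mirroring the reported hyperparameters:
# the lambda-update parameters of Eq. (2) and the Phi-regularization
# weights of Eq. (3). Key names are illustrative, not the authors'.
sibblings_params = {
    "lambda_update": {          # Eq. (2)
        "epsilon": 0.01,
        "beta": 0.09,
        "w_graph": 1,
    },
    "phi_regularization": {     # Eq. (3)
        "gamma1": 0.1,
        "gamma2": 0.1,
        "gamma3": 0,
        "gamma4": 0.0001,
    },
}
```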