Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

A Non-Asymptotic Analysis for Stein Variational Gradient Descent

Authors: Anna Korba, Adil Salim, Michael Arbel, Giulia Luise, Arthur Gretton

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We illustrate the validity of the rates of Corollary 6 with simple experiments provided Section 13."
Researcher Affiliation | Academia | Anna Korba (Gatsby Computational Neuroscience Unit, University College London); Adil Salim (Visual Computing Center, KAUST); Michael Arbel (Gatsby Computational Neuroscience Unit, University College London); Giulia Luise (Computer Science Department, University College London); Arthur Gretton (Gatsby Computational Neuroscience Unit, University College London)
Pseudocode | No | The paper does not contain a pseudocode block or a clearly labeled algorithm block.
Open Source Code | No | The paper does not contain any statement about open-source code availability or links to code repositories.
Open Datasets | No | The paper mentions that "toy experiments are deferred to the appendix" but does not specify any publicly available datasets used for training in the main text.
Dataset Splits | No | The paper does not provide specific training/validation/test dataset splits. It mentions "toy experiments" but defers details to the appendix.
Hardware Specification | No | The paper does not explicitly describe the hardware used to run its experiments.
Software Dependencies | No | The paper does not provide a reproducible description of ancillary software with specific version numbers.
Experiment Setup | No | The paper mentions that "toy experiments are deferred to the appendix" but does not provide specific details about the experimental setup, hyperparameters, or system-level training settings in the main text.