Decoupled Variational Gaussian Inference

Authors: Mohammad Emtiyaz Khan

NeurIPS 2014

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate the advantages of our approach on a binary GP classification problem. We model the binary data using Bernoulli-logit likelihoods. We apply this model to a subproblem of the USPS digit data [18].
Researcher Affiliation | Academia | Mohammad Emtiyaz Khan, École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, emtiyaz@gmail.com
Pseudocode | Yes | Algorithm 1: Linearly constrained Lagrangian (LCL) method for VG approximation
Open Source Code | No | The paper states: 'In future, we plan to have an efficient implementation of this method and demonstrate that this enables variational inference to scale to large data.', indicating a planned future release rather than current availability. No explicit link or statement of an immediate code release was found.
Open Datasets | Yes | We apply this model to a subproblem of the USPS digit data [18].
Dataset Splits | No | The paper mentions using a subproblem of the USPS digit data with 'a total of 1540 data examples' and subsampling randomly, but does not specify exact training, validation, or test splits (e.g., percentages, counts, or references to predefined splits).
Hardware Specification | No | The paper does not provide any hardware details such as GPU models, CPU types, or memory specifications used for the experiments.
Software Dependencies | No | The paper mentions the 'L-BFGS method for optimization (implemented in minFunc by Mark Schmidt)', but does not provide version numbers for minFunc or any other software dependency.
Experiment Setup | Yes | We set µ = 0 and use a squared-exponential kernel, for which the (i, j)th entry of Σ is defined as Σ_ij = σ² exp(−½ ||x_i − x_j||² / s), where x_i is the i-th feature vector. We show results for log(σ) = 4 and log(s) = 1... All algorithms were stopped when subsequent changes in the lower-bound value of Eq. 5 were less than 10⁻⁴.
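The quoted setup can be made concrete with a short sketch of the squared-exponential kernel it describes, Σ_ij = σ² exp(−½ ||x_i − x_j||² / s), evaluated at the stated hyperparameters log(σ) = 4 and log(s) = 1. This is an illustrative NumPy reconstruction, not the paper's code; the function name `se_kernel` and the toy input are assumptions.

```python
import numpy as np

def se_kernel(X, log_sigma=4.0, log_s=1.0):
    # Squared-exponential kernel from the quoted setup:
    #   Sigma_ij = sigma^2 * exp(-0.5 * ||x_i - x_j||^2 / s)
    # Hyperparameters are given on the log scale, matching the paper's
    # reported values log(sigma) = 4 and log(s) = 1.
    sigma2 = np.exp(2.0 * log_sigma)  # sigma^2
    s = np.exp(log_s)                 # length-scale-like parameter s
    # Pairwise squared Euclidean distances via broadcasting.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return sigma2 * np.exp(-0.5 * d2 / s)

# Toy usage on random features (the paper uses USPS digit features).
X = np.random.default_rng(0).normal(size=(5, 3))
K = se_kernel(X)
```

On the diagonal the distance is zero, so every K_ii equals σ² = exp(8); off-diagonal entries decay with squared distance, giving the usual smooth GP prior covariance.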