Kullback-Leibler Proximal Variational Inference

Authors: Mohammad Emtiyaz Khan, Pierre Baqué, François Fleuret, Pascal Fua

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We now present some results on the real data. Our goal is to show that our approach gives comparable results to existing methods and is easy to implement. We also show that, in some cases, our method is significantly faster than the alternatives due to the kernel trick. We show results on three models: Bayesian logistic regression, GP classification with logistic likelihood, and GP regression with Laplace likelihood.
Researcher Affiliation | Academia | Mohammad Emtiyaz Khan, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland (emtiyaz@gmail.com); Pierre Baqué, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland (pierre.baque@epfl.ch); François Fleuret, Idiap Research Institute, Martigny, Switzerland (francois.fleuret@idiap.ch); Pascal Fua, École Polytechnique Fédérale de Lausanne, Lausanne, Switzerland (pascal.fua@epfl.ch)
Pseudocode | Yes | Algorithm 1: 'Proximal-gradient algorithm for linear basis function models and Gaussian process'. A minimal illustrative sketch of such an update is given after this table.
Open Source Code | No | The paper states: 'We implemented these methods using minFunc software by Mark Schmidt.' It links to this third-party software (https://www.cs.ubc.ca/~schmidtm/Software/minFunc.html) but does not provide the authors' own implementation of the proposed method.
Open Datasets | Yes | 'These datasets can be found at the data repositories of LIBSVM and UCI' (https://archive.ics.uci.edu/ml/datasets.html and http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/).
Dataset Splits | Yes | 'To set the hyperparameter δ, we use cross-validation for MAP, and maximum marginal-likelihood estimate for the rest of the methods.' Reported splits: a1a — N = 32,561, D = 123, 5% train, 1 split, δ = logspace(-3,1,30); Colon — N = 62, D = 2000, 50% train, 10 splits, δ = logspace(0,6,30).
Hardware Specification | No | The paper does not provide any specific hardware details such as GPU or CPU models, memory, or the computing environment used for the experiments.
Software Dependencies | No | The paper mentions the 'minFunc' software by Mark Schmidt, L-BFGS, and the GPML toolbox, but does not provide version numbers for any of these dependencies.
Experiment Setup | Yes | 'We use a fixed step-size of β_k = 0.25 and 1 for the logistic and Laplace likelihoods, respectively. We set the Gaussian prior to Σ = δI and µ = 0. All algorithms are stopped when the optimality condition is below 10^-4.'
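
As a companion to the pseudocode row above, the following is a minimal, hedged sketch of a proximal-gradient-style variational update for Bayesian logistic regression with a Gaussian prior N(0, δI), written in NumPy. It illustrates the general technique (damped fixed-point updates on the Gaussian posterior, with expected log-likelihood gradients estimated by Gauss-Hermite quadrature); it is not the authors' Algorithm 1 or their implementation, and the function names `expected_loglik_grads` and `kl_proximal_logreg` as well as the exact update form are assumptions made for this example. Default settings mirror the reported setup: step size β = 0.25, prior Σ = δI, and a 10^-4 stopping tolerance.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss  # Gauss-Hermite nodes for N(0,1) expectations


def expected_loglik_grads(mu, var, y, n_quad=20):
    """Quadrature estimates of the gradients of f_i = E[log sigmoid(y_i * a)],
    a ~ N(mu_i, var_i), w.r.t. mu_i and var_i (Bonnet/Price identities)."""
    t, w = hermegauss(n_quad)
    w = w / np.sqrt(2.0 * np.pi)                        # weights now sum to 1
    a = mu[:, None] + np.sqrt(var)[:, None] * t[None, :]
    s = 1.0 / (1.0 + np.exp(-y[:, None] * a))           # sigmoid(y_i * a)
    g = ((y[:, None] * (1.0 - s)) * w).sum(axis=1)      # dE[f]/dmu_i  = E[f'(a)]
    h = 0.5 * ((-s * (1.0 - s)) * w).sum(axis=1)        # dE[f]/dvar_i = E[f''(a)] / 2
    return g, h


def kl_proximal_logreg(X, y, delta=1.0, beta=0.25, max_iter=500, tol=1e-4):
    """Damped (proximal-gradient-style) fixed-point updates for a Gaussian
    posterior q = N(m, V) under the prior N(0, delta * I). Illustrative only."""
    n, d = X.shape
    prior_prec = np.eye(d) / delta
    m, P = np.zeros(d), prior_prec.copy()               # P is the posterior precision V^{-1}
    for _ in range(max_iter):
        V = np.linalg.inv(P)
        mu = X @ m                                       # marginal means  x_i^T m
        var = np.einsum('ij,jk,ik->i', X, V, X)          # marginal variances x_i^T V x_i
        g, h = expected_loglik_grads(mu, var, y)
        grad_m = X.T @ g - prior_prec @ m                # ELBO gradient w.r.t. m
        # damped update of the precision toward its fixed point: prior_prec - 2 * dF/dV
        P = (1.0 - beta) * P + beta * (prior_prec + X.T @ (np.maximum(-2.0 * h, 0.0)[:, None] * X))
        m = m + beta * np.linalg.solve(P, grad_m)        # preconditioned step on the mean
        if np.max(np.abs(grad_m)) < tol:                 # stop on the optimality condition
            break
    return m, np.linalg.inv(P)
```

For use, labels are assumed to be in {-1, +1}; the prior scale δ could then be selected over a grid such as np.logspace(-3, 1, 30), matching the hyperparameter range reported in the dataset-splits row.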