Extended and Unscented Kitchen Sinks

Authors: Edwin Bonilla, Daniel Steinberg, Alistair Reid

ICML 2016

Reproducibility assessment. Each entry lists the variable, the assessed result, and the supporting LLM response; quoted passages are from the paper.

Research Type: Experimental
"We also evaluate our methods at a larger scale on MNIST and on a seismic inversion which is inherently a multi-task problem." "Our experiments on small-scale synthetic nonlinear inversion tasks and on a classification task on the USPS dataset show that random feature approximations to the EGP and the UGP can attain similar performance to the original methods."
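
The "random feature approximations" quoted above are the random Fourier features (the "kitchen sinks" of Rahimi & Recht) that give the paper its title. The sketch below shows that construction for a squared exponential kernel; it assumes nothing about the authors' implementation, and the function name and parameter values are illustrative.

```python
import numpy as np

def random_fourier_features(X, n_features=100, lengthscale=0.6, seed=0):
    """Random Fourier features phi(x) with phi(x) @ phi(x') ~= k(x, x')
    for the squared exponential kernel k (unit signal variance; scale
    the features by sigma_s to add one)."""
    rng = np.random.default_rng(seed)
    # The SE kernel's spectral density is Gaussian with scale 1/lengthscale.
    W = rng.normal(scale=1.0 / lengthscale, size=(X.shape[1], n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Sanity check: feature inner products approach the exact SE kernel
# as the number of features grows.
X = np.random.default_rng(1).uniform(-2 * np.pi, 2 * np.pi, size=(5, 1))
Phi = random_fourier_features(X, n_features=20000)
exact = np.exp(-0.5 * ((X - X.T) / 0.6) ** 2)
print(np.max(np.abs(Phi @ Phi.T - exact)))  # shrinks like 1/sqrt(n_features)
```

More features tighten the kernel approximation but raise the cost of the linear model built on top of them, which is the accuracy/scalability trade-off the paper's experiments probe.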

Researcher Affiliation: Collaboration
Edwin V. Bonilla (e.bonilla@unsw.edu.au), The University of New South Wales; Daniel Steinberg (daniel.steinberg@nicta.com.au), NICTA; Alistair Reid (alistair.reid@nicta.com.au), NICTA.

Pseudocode: No
The paper does not contain any pseudocode or clearly labeled algorithm blocks.

Open Source Code: No
The paper does not provide any explicit statement about releasing source code or a link to a code repository.

Open Datasets: Yes
"Furthermore, experiments at a larger scale on MNIST show that our algorithms are competitive with recently developed approaches for inference in GP models..." "This is a binary classification task to distinguish between images of the handwritten digits 3 and 5 in the USPS digits datasets (Rasmussen & Williams, 2006)." "Our dataset is part of a real seismic survey of the Otway basin region in Victoria, Australia."

Dataset Splits: Yes
"We test our algorithms and the baselines (UGP, EGP) with five simple forward models; an identity function (linear), a 3rd order polynomial with no cross terms (poly3), an exponential function, a sinusoid, and a tangent function. We present the results of 5-fold cross validation (200 training, 800 testing) in Table 1..." "Here we present results on a larger application on the MNIST dataset, which contains examples of handwritten digits, 50,000 for training, 10,000 for validation and 10,000 for testing. In our experiments, we always train on 60,000 examples that include the training and the validation set and tune the parameters of our models via optimization of the variational bound."
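
Note that the quoted protocol inverts the usual 5-fold convention: with 1000 points, each fold trains on 200 and tests on the remaining 800. A minimal sketch under that reading (illustrative, not the authors' code):

```python
import numpy as np
from sklearn.model_selection import KFold

# 1000 inputs, as in the synthetic inversion experiments.
X = np.linspace(-2 * np.pi, 2 * np.pi, 1000).reshape(-1, 1)

# KFold yields (rest, fold) index pairs; swapping the roles makes each
# 200-point fold the training set and the other 800 points the test set.
kf = KFold(n_splits=5, shuffle=True, random_state=0)
for test_idx, train_idx in kf.split(X):
    assert len(train_idx) == 200 and len(test_idx) == 800
```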

Hardware Specification: No
The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running the experiments.

Software Dependencies: No
The paper does not provide specific software dependencies with version numbers.

Experiment Setup: Yes
"In this experiment we generate latent function values (f) from a GP with isotropic squared exponential covariance function (having a signal variance σ_s² = 0.8² and a lengthscale ℓ = 0.6) at 1000 input points, x ∈ ℝ, which are uniformly spaced between [−2π, 2π]. We test our algorithms and the baselines (UGP, EGP) with five simple forward models; an identity function (linear), a 3rd order polynomial with no cross terms (poly3), an exponential function, a sinusoid, and a tangent function." "We use a logistic sigmoid as a forward model in this task and the same settings as in the original experiments for covariance functions and observation variance."
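
For readers reconstructing the synthetic setup, a minimal sketch of the quoted data generation follows; the GP settings are taken from the quote, while the exact polynomial coefficients and any observation noise are assumptions not stated in the excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

# Quoted settings: SE covariance with signal variance 0.8**2, lengthscale
# 0.6, evaluated at 1000 uniformly spaced inputs in [-2*pi, 2*pi].
x = np.linspace(-2 * np.pi, 2 * np.pi, 1000)
sigma_s, ell = 0.8, 0.6
K = sigma_s**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2)

# Draw latent function values f from the GP prior (jitter for stability).
f = rng.multivariate_normal(np.zeros_like(x), K + 1e-6 * np.eye(len(x)))

# The five quoted forward models; the poly3 coefficients are illustrative.
forward_models = {
    "linear": lambda f: f,
    "poly3": lambda f: f + 0.5 * f**3,  # 3rd order, no cross terms
    "exp": np.exp,
    "sin": np.sin,
    "tan": np.tan,
}
observations = {name: g(f) for name, g in forward_models.items()}
```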