Semi-supervised Learning with Deep Generative Models

Authors: Diederik P. Kingma, Shakir Mohamed, Danilo Jimenez Rezende, Max Welling

NeurIPS 2014

Reproducibility variables, results, and supporting LLM responses:
Research Type: Experimental
  "We demonstrate the performance of our approach on a number of data sets providing state-of-the-art results on benchmark problems." "Table 1: Benchmark results of semi-supervised classification on MNIST with few labels."
Researcher Affiliation: Collaboration
  "Machine Learning Group, Univ. of Amsterdam, {D.P.Kingma, M.Welling}@uva.nl; Google Deepmind, {danilor, shakir}@google.com"
Pseudocode: Yes
  "Algorithm 1: Learning in model M1"
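Algorithm 1 in the paper covers stochastic gradient variational Bayes learning in the latent-feature model M1. As a rough illustration only (not the paper's exact procedure), a one-sample reparameterized estimate of the variational lower bound might look like the sketch below; `enc` and `dec` are assumed encoder/decoder callables, and the Bernoulli likelihood is an assumption for binarized MNIST:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elbo_estimate(x, enc, dec, rng):
    """One-sample SGVB estimate of the M1 bound
    E_q[log p(x|z)] - KL[q(z|x) || p(z)], using the reparameterization
    z = mu + sigma * eps. Assumed interfaces: enc(x) -> (mu, log_var),
    dec(z) -> Bernoulli logits over pixels."""
    mu, log_var = enc(x)
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * log_var) * eps          # reparameterization trick
    logits = dec(z)
    # Bernoulli log-likelihood log p(x|z)
    p = sigmoid(logits)
    log_px_z = np.sum(x * np.log(p + 1e-10) + (1 - x) * np.log(1 - p + 1e-10))
    # Analytic KL[q(z|x) || N(0, I)] for a diagonal Gaussian posterior
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return log_px_z - kl
```

Gradients of this estimate with respect to the encoder/decoder parameters flow through `z` thanks to the reparameterization, which is what makes minibatch gradient ascent on the bound possible.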
Open Source Code: Yes
  "Open source code, with which the most important results and figures can be reproduced, is available at http://github.com/dpkingma/nips14-ssl."
Open Datasets: Yes
  "We test performance on the standard MNIST digit classification benchmark."
Dataset Splits: No
  The paper describes how the dataset is partitioned into labeled and unlabeled sets for semi-supervised learning and later reports test-set performance, but it does not specify a distinct validation split (e.g., percentages or counts for a validation set).
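Semi-supervised MNIST benchmarks of this kind typically draw the labeled subset with an equal number of examples per class and treat the remainder as unlabeled. A hypothetical helper illustrating that protocol (the function name and balancing scheme are assumptions, not taken from the paper):

```python
import numpy as np

def balanced_labeled_split(labels, n_labeled, n_classes=10, seed=0):
    """Pick n_labeled indices with an equal count per class; all remaining
    indices are treated as unlabeled. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    per_class = n_labeled // n_classes
    labeled = []
    for c in range(n_classes):
        idx = np.flatnonzero(labels == c)          # all examples of class c
        labeled.extend(rng.choice(idx, size=per_class, replace=False))
    labeled = np.array(labeled)
    unlabeled = np.setdiff1d(np.arange(len(labels)), labeled)
    return labeled, unlabeled
```

For example, with 100 labels on MNIST this yields 10 labeled examples per digit class; the 59,900 remaining training images serve as the unlabeled set.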
Hardware Specification: No
  The paper mentions "the Dutch national e-infrastructure" for experiments but does not provide specific hardware details such as GPU/CPU models or memory specifications.
Software Dependencies: No
  The paper states that open source code is available but does not list software dependencies with version numbers (e.g., Python or library versions) in the provided text.
Experiment Setup: Yes
  "The parameters were initialized by sampling randomly from N(0, 0.001²I), except for the bias parameters which were initialized as 0. The objectives were optimized using minibatch gradient ascent until convergence, using a variant of RMSProp with momentum and initialization bias correction, a constant learning rate of 0.0003, first moment decay (momentum) of 0.1, and second moment decay of 0.001."
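The setup quoted above can be sketched in code. The initializer follows the stated N(0, 0.001²I) scheme; the optimizer is a minimal interpretation of "RMSProp with momentum and initialization bias correction" using the quoted hyperparameters, where the exact form of the update and the Adam-style bias correction are assumptions, not the paper's released implementation:

```python
import numpy as np

def init_params(shape, rng, std=1e-3):
    """Weights sampled from N(0, 0.001^2 I); bias vectors start at zero."""
    return rng.normal(0.0, std, size=shape)

class RMSPropMomentum:
    """Sketch of an RMSProp variant with momentum and initialization bias
    correction. Hyperparameters from the quoted setup: lr=3e-4,
    first-moment decay 0.1, second-moment decay 0.001."""

    def __init__(self, lr=3e-4, beta1=0.1, beta2=0.001, eps=1e-8):
        self.lr, self.beta1, self.beta2, self.eps = lr, beta1, beta2, eps
        self.m = None  # first moment (momentum)
        self.v = None  # second moment (running squared-gradient average)
        self.t = 0

    def step(self, params, grad):
        if self.m is None:
            self.m = np.zeros_like(params)
            self.v = np.zeros_like(params)
        self.t += 1
        # Decay rates 0.1 and 0.001 interpreted as the update weights
        # of the running averages (an assumption).
        self.m = (1 - self.beta1) * self.m + self.beta1 * grad
        self.v = (1 - self.beta2) * self.v + self.beta2 * grad**2
        # Adam-style initialization bias correction (assumed form).
        m_hat = self.m / (1 - (1 - self.beta1) ** self.t)
        v_hat = self.v / (1 - (1 - self.beta2) ** self.t)
        # Gradient *ascent*, since the objective is a lower bound to maximize.
        return params + self.lr * m_hat / (np.sqrt(v_hat) + self.eps)
```

Note the sign of the update: because the variational lower bound is maximized, the step moves parameters along the gradient rather than against it.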