Natural-Parameter Networks: A Class of Probabilistic Neural Networks

Authors: Hao Wang, Xingjian Shi, Dit-Yan Yeung

NeurIPS 2016

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on real-world datasets show that NPN can achieve state-of-the-art performance.
Researcher Affiliation | Academia | Hao Wang, Xingjian Shi, Dit-Yan Yeung; Hong Kong University of Science and Technology; {hwangaz,xshiab,dyyeung}@cse.ust.hk
Pseudocode | Yes | Algorithm 1 (Deep Nonlinear NPN); a hedged sketch of one such layer follows the table.
Open Source Code | No | The paper states, 'We use Matlab (with GPU) to implement NPN, AE variants, and the vanilla NN trained with dropout SGD (dropout NN),' but provides no link or statement on the public availability of the NPN source code.
Open Datasets | Yes | The MNIST digit dataset consists of 60,000 training images and 10,000 test images. Three real-world datasets, Citeulike-a, Citeulike-t, and arXiv, are used. The first two are from [22, 23], collected separately from CiteULike in different ways to mimic different real-world settings. The third is from arXiv as one of the SNAP datasets [15].
Dataset Splits | Yes | We train the models with 50,000 images and use 10,000 images for validation.
Hardware Specification | No | The paper mentions using 'Matlab (with GPU)' but does not specify a particular GPU model, CPU, or other hardware details for the experiments.
Software Dependencies | No | The paper mentions Matlab, the Theano library [2], and MXNet [5] but does not provide version numbers for any of these dependencies.
Experiment Setup | No | The paper states, 'We implement BDK and NPN using the same hyperparameters as in [1] whenever possible,' and 'Gaussian priors are used for NPN (see the supplementary material for detailed hyperparameters).' Hyperparameter details are deferred to external sources and the supplementary material rather than given in the main text.
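
The Pseudocode row refers to Algorithm 1 (Deep Nonlinear NPN), which propagates a mean and a variance through each layer instead of a single point estimate. Below is a minimal NumPy sketch of one Gaussian NPN layer under that reading. The function and variable names (npn_linear, npn_sigmoid, a_m, a_s, W_m, W_s) are illustrative assumptions, not the authors' Matlab code, and the sigmoid output variance here is a crude delta-method estimate rather than the paper's closed-form expression.

```python
import numpy as np

def npn_linear(a_m, a_s, W_m, W_s, b_m, b_s):
    """Propagate an input Gaussian (mean a_m, variance a_s) through a
    linear layer whose weights are factorized Gaussians (W_m, W_s)."""
    o_m = a_m @ W_m + b_m
    # For independent Gaussians, Var(w * a) =
    #   E[a]^2 Var(w) + Var(a) E[w]^2 + Var(a) Var(w)
    o_s = (a_m ** 2) @ W_s + a_s @ (W_m ** 2) + a_s @ W_s + b_s
    return o_m, o_s

def npn_sigmoid(o_m, o_s):
    """Moment matching for a sigmoid activation via the probit
    approximation sigma(x) ~ Phi(zeta * x) with zeta^2 = pi / 8."""
    zeta2 = np.pi / 8.0
    a_m = 1.0 / (1.0 + np.exp(-o_m / np.sqrt(1.0 + zeta2 * o_s)))
    # Placeholder variance via the delta method: sigma'(m)^2 * Var(x);
    # the paper instead derives a matching closed-form output variance.
    a_s = (a_m * (1.0 - a_m)) ** 2 * o_s
    return a_m, a_s

# Toy forward pass: a 4-dimensional input through 3 hidden units.
rng = np.random.default_rng(0)
a_m, a_s = rng.normal(size=(1, 4)), np.full((1, 4), 0.1)
W_m, W_s = rng.normal(size=(4, 3)), np.full((4, 3), 0.05)
b_m, b_s = np.zeros(3), np.full(3, 0.05)
h_m, h_s = npn_sigmoid(*npn_linear(a_m, a_s, W_m, W_s, b_m, b_s))
print(h_m, h_s)  # per-unit output mean and variance
```

Stacking such layers yields the deep nonlinear NPN of Algorithm 1: each layer consumes and emits a (mean, variance) pair, so predictive uncertainty is available at the output without sampling.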