Kernel Implicit Variational Inference
Authors: Jiaxin Shi, Shengyang Sun, Jun Zhu
ICLR 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We present empirical results on both synthetic and real datasets to demonstrate the benefits of KIVI. |
| Researcher Affiliation | Academia | Department of Computer Science & Technology, THU Lab for Brain and AI, Tsinghua University; Department of Computer Science, University of Toronto |
| Pseudocode | Yes | Algorithm 1 Kernel Implicit Variational Inference (KIVI) and Algorithm 2 MMNN (Matrix Multiplication Neural Network); a hedged sketch of the density-ratio idea behind Algorithm 1 follows the table. |
| Open Source Code | No | The paper mentions that 'All implementations are based on ZhuSuan (Shi et al., 2017)', which is cited as 'A library for Bayesian deep learning'. However, it does not provide a direct link or an explicit statement that the code for this specific paper's methodology is open source or available. |
| Open Datasets | Yes | We present empirical results on both synthetic and real datasets... for regression benchmarks... Boston, Concrete, Energy, Kin8nm, Naval, Combined, Protein, Wine, Yacht, Year... We conduct experiments on two widely used datasets for generative modeling: binarized MNIST and CelebA (Liu et al., 2015). |
| Dataset Splits | Yes | We used the last 10,000 samples of the training set as the validation set for model selection. (a split sketch follows the table) |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory, or computational resources) used for running the experiments. |
| Software Dependencies | No | The paper mentions 'All implementations are based on ZhuSuan (Shi et al., 2017)', but no version numbers are provided for ZhuSuan or any other software dependencies. |
| Experiment Setup | Yes | For all datasets, we set np = nq = M = 100 and λ = 0.001, with the batch size set to 100 and the learning rate to 0.001. The model is trained for 3000 epochs on the small datasets with fewer than 1000 data points, and for 500 epochs on the others. (collected into a config sketch below the table) |
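The "Pseudocode" row refers to Algorithm 1 (KIVI). To illustrate the core step such a method relies on, estimating the intractable KL term between an implicit posterior q and a prior p from samples alone, here is a minimal NumPy sketch of a kernel ridge-regression density-ratio estimator (KuLSIF-style). The RBF kernel, bandwidth, function names, and the choice to estimate q/p directly are assumptions for illustration; the paper's Algorithm 1 differs in its details, and this is not the authors' code.

```python
import numpy as np

def rbf_kernel(X, Y, bandwidth=1.0):
    """Gaussian RBF Gram matrix: K[i, j] = exp(-||x_i - y_j||^2 / (2 h^2))."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def fit_log_ratio(z_q, z_p, lam=1e-3, bandwidth=1.0):
    """Kernel ridge estimate of the density ratio r(z) = q(z) / p(z).

    Minimizes (1/2n_p) sum_i r(z_i^p)^2 - (1/n_q) sum_j r(z_j^q) + (lam/2) ||r||_H^2,
    whose population minimizer is r = q/p. By the representer theorem,
    r(.) = sum_k alpha_k K(., z_k) over all n_p + n_q samples, giving a linear solve.
    """
    n_q, n_p = len(z_q), len(z_p)
    Z = np.concatenate([z_p, z_q], axis=0)   # representer points
    K = rbf_kernel(Z, Z, bandwidth)          # full Gram matrix
    K_p, K_q = K[:n_p, :], K[n_p:, :]        # rows at p-samples / q-samples
    A = K_p.T @ K_p / n_p + lam * K
    b = K_q.sum(axis=0) / n_q
    alpha = np.linalg.solve(A + 1e-8 * np.eye(n_p + n_q), b)

    def log_ratio(z):
        r = rbf_kernel(z, Z, bandwidth) @ alpha
        return np.log(np.clip(r, 1e-8, None))  # clip: the estimate can dip <= 0
    return log_ratio

# KL(q || p) = E_q[log q/p], approximated by averaging log r over fresh q-samples.
rng = np.random.default_rng(0)
z_q = rng.normal(0.5, 1.0, size=(100, 2))    # stand-in for implicit-posterior samples
z_p = rng.normal(0.0, 1.0, size=(100, 2))    # stand-in for prior samples
log_r = fit_log_ratio(z_q, z_p, lam=1e-3)
kl_hat = log_r(rng.normal(0.5, 1.0, size=(100, 2))).mean()
print(f"estimated KL(q||p): {kl_hat:.3f}")   # true value for these Gaussians is 0.25
```

Assuming this reading of the paper's notation, `lam` plays the role of the λ = 0.001 regularizer quoted in the "Experiment Setup" row.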
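The "Dataset Splits" row quotes only the MNIST validation split. Below is a minimal sketch of that split, assuming the standard 60,000-image binarized MNIST training set; the file path and array names are illustrative, since the paper's data pipeline is not released.

```python
import numpy as np

# Hypothetical file; the paper does not specify its loading code.
x_train = np.load("binarized_mnist_train.npy")  # assumed shape: (60000, 784)

# "We used the last 10,000 samples of the training set as the validation set."
x_valid = x_train[-10000:]
x_train = x_train[:-10000]
print(x_train.shape, x_valid.shape)  # (50000, 784) (10000, 784)
```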
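The "Experiment Setup" row lists the regression hyperparameters inline. They are collected here as a config sketch for quick reference; the dict structure, key names, and helper function are illustrative, while the values are the ones quoted from the paper.

```python
# Values quoted from the paper's regression setup; the structure is ours.
KIVI_REGRESSION_CONFIG = {
    "n_p": 100,            # np = nq = M = 100 per the paper
    "n_q": 100,            # (their exact roles follow the paper's Algorithm 1)
    "M": 100,
    "lambda": 1e-3,        # ridge regularizer, lambda = 0.001
    "batch_size": 100,
    "learning_rate": 1e-3,
}

def num_epochs(n_data_points: int) -> int:
    """3000 epochs for small datasets (fewer than 1000 points), 500 otherwise."""
    return 3000 if n_data_points < 1000 else 500

print(num_epochs(506))     # e.g. Boston housing (506 points) -> 3000
```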