Variational Mixture of HyperGenerators for Learning Distributions over Functions

Authors: Batuhan Koyuncu, Pablo Sanchez Martin, Ignacio Peis, Pablo M. Olmos, Isabel Valera

ICML 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through experiments on a diverse range of data types, such as images, voxels, and climate data, we show that VaMoH can effectively learn rich distributions over continuous functions.
Researcher Affiliation | Academia | (1) Saarland University, Saarbrücken, Germany; (2) Max Planck Institute for Intelligent Systems, Tübingen, Germany; (3) Universidad Carlos III de Madrid, Madrid, Spain.
Pseudocode | Yes | Algorithm 1: Minibatch training of VaMoH (an illustrative sketch of such a training loop follows this table).
Open Source Code | Yes | The code with the model implementation and experiments is available at https://github.com/bkoyuncu/vamoh.
Open Datasets | Yes | We evaluate VaMoH on POLYMNIST (28×28), CELEBA-HQ (64×64) (Karras et al., 2017), SHAPES3D (64×64) (Burgess & Kim, 2018), climate data from the ERA5 dataset (Hersbach et al., 2019), and 3D chair voxels from the SHAPENET dataset (Chang et al., 2015).
Dataset Splits | No | The paper does not explicitly provide the training/validation/test splits (e.g., percentages or sample counts) needed to reproduce the experiments.
Hardware Specification | Yes | We implemented VaMoH in PyTorch and performed all experiments on a single V100 with 32GB of RAM.
Software Dependencies | No | The paper mentions PyTorch but does not specify its version or the versions of other key software dependencies or libraries.
Experiment Setup | Yes | Implementation details for VaMoH are provided in Table 3.
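The Pseudocode row above cites Algorithm 1 (minibatch training of VaMoH) in the paper. The snippet below is only a rough, generic illustration of what a minibatch ELBO training loop over coordinate/value pairs tends to look like; the `ToyCoordVAE` class, `train_step` function, and every hyperparameter here are hypothetical placeholders, not the authors' algorithm or code. For the actual Algorithm 1, consult the paper and the repository linked in the table.

```python
# Hedged sketch: a generic minibatch training loop for a VAE-style model over
# (coordinate, value) pairs. Names and shapes are illustrative placeholders,
# NOT the VaMoH implementation (which uses a mixture of hypernetwork generators).
import torch
import torch.nn as nn


class ToyCoordVAE(nn.Module):
    """Placeholder encoder/decoder over point sets of (coords, values)."""

    def __init__(self, coord_dim=2, value_dim=3, latent_dim=16, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(coord_dim + value_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * latent_dim),            # mean and log-variance
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + coord_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, value_dim),                  # predicted values at coords
        )

    def forward(self, coords, values):
        # Encode each (coordinate, value) pair, then pool over the point set.
        h = self.encoder(torch.cat([coords, values], dim=-1)).mean(dim=1)
        mu, logvar = h.chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization
        z_exp = z.unsqueeze(1).expand(-1, coords.shape[1], -1)
        recon = self.decoder(torch.cat([z_exp, coords], dim=-1))
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return recon, kl


def train_step(model, optimizer, coords, values):
    """One minibatch update of a standard negative ELBO (reconstruction + KL)."""
    recon, kl = model(coords, values)
    recon_loss = ((recon - values) ** 2).sum(dim=(1, 2)).mean()
    loss = recon_loss + kl
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = ToyCoordVAE()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Synthetic stand-in: a batch of 8 functions observed at 100 coordinates each.
    coords = torch.rand(8, 100, 2)   # point locations in [0, 1]^2
    values = torch.rand(8, 100, 3)   # e.g. RGB values at those locations
    for step in range(5):
        print(f"step {step}: loss = {train_step(model, optimizer, coords, values):.3f}")
```

Pooling the per-point encodings before sampling the latent code reflects the general pattern of encoding a function from an arbitrary set of coordinate observations; the actual VaMoH model additionally decodes through a mixture of hypernetwork-generated functions, which this toy sketch deliberately omits.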