Multi-Source Neural Variational Inference
Authors: Richard Kurle, Stephan Günnemann, Patrick van der Smagt
AAAI 2019, pp. 4114-4121
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We visualise learned beliefs on a toy dataset and evaluate our methods for learning shared representations and structured output prediction, showing trade-offs of learning separate encoders for each information source. Furthermore, we demonstrate how conflict detection and redundancy can increase robustness of inference in a multi-source setting. |
| Researcher Affiliation | Collaboration | Richard Kurle, Department of Informatics, Technical University of Munich, and Data:Lab, Volkswagen Group, 80805 Munich, Germany (richard.kurle@tum.de); Stephan Günnemann, Department of Informatics, Technical University of Munich (guennemann@in.tum.de); Patrick van der Smagt, Data:Lab, Volkswagen Group, 80805 Munich, Germany |
| Pseudocode | No | The paper does not contain pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements about open-source code availability or links to code repositories. |
| Open Datasets | Yes | We created 3 variants of MNIST (Lecun et al. 1998); Caltech-UCSD Birds 200 (Welinder et al. 2010) |
| Dataset Splits | No | The paper specifies training and test splits for the Caltech-UCSD Birds 200 dataset, but does not explicitly mention a separate validation split or its size for any dataset used. |
| Hardware Specification | No | The paper does not provide any specific hardware details used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software dependencies. |
| Experiment Setup | No | Model and algorithm hyperparameters are summarised in the supplementary material of our technical report (Kurle, Günnemann, and van der Smagt 2018). |