Variational Inference in Mixed Probabilistic Submodular Models

Authors: Josip Djolonga, Sebastian Tschiatschek, Andreas Krause

NeurIPS 2016

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "We demonstrate the effectiveness of our approach in a large set of experiments, where our model allows reasoning about preferences over sets of items with complements and substitutes." (Section 5, Experiments) |
| Researcher Affiliation | Academia | Josip Djolonga, Sebastian Tschiatschek, Andreas Krause; Department of Computer Science, ETH Zürich ({josipd,tschiats,krausea}@inf.ethz.ch) |
| Pseudocode | Yes | "Algorithm 1: Modular upper bound for M♮-concave functions" |
| Open Source Code | No | The paper does not provide an explicit statement or link indicating open-source code availability for the described methodology. |
| Open Datasets | Yes | "Dataset. We use the Amazon baby registry dataset [21] for evaluating our proposed variational inference scheme." |
| Dataset Splits | Yes | "We then used the trained models for the product recommendation task from the previous section and estimated the performance metrics using 10-fold cross-validation." |
| Hardware Specification | No | The paper does not provide specific hardware details used for running the experiments. |
| Software Dependencies | No | The paper does not specify software dependencies with version numbers. |
| Experiment Setup | Yes | "We used stochastic gradient descent for optimizing the NCE objective, created 200,000 noise samples from the modular model and made 100 passes through the data and noise samples. We used K = 10, L = 10 dimensions for the weights (if applicable for the corresponding model)." |
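The Dataset Splits row reports metrics estimated with 10-fold cross-validation. As a minimal sketch of that protocol (the `kfold_splits` helper below is illustrative and not taken from the paper), each item is shuffled into one of ten disjoint test folds, and the remaining nine folds form the training set for that round:

```python
import numpy as np

def kfold_splits(n_items, k=10, seed=0):
    """Yield (train_idx, test_idx) index pairs for k-fold cross-validation."""
    idx = np.random.default_rng(seed).permutation(n_items)
    folds = np.array_split(idx, k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# Example: accumulate a (placeholder) per-fold metric over the 10 folds.
scores = []
for train, test in kfold_splits(100, k=10):
    # In the paper's setting one would fit the model on `train` and
    # evaluate the recommendation metrics on `test` here.
    scores.append(len(test))  # placeholder metric: fold size
print(sum(scores))  # prints 100: every item lands in exactly one test fold
```

The performance metric is then averaged over the ten held-out folds, so every registry contributes to evaluation exactly once.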
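The Experiment Setup row describes fitting models by optimizing a noise-contrastive estimation (NCE) objective with stochastic gradient descent, using noise samples drawn from a modular model. The sketch below is a hedged illustration of that idea, not the authors' code: it fits a small modular model over subsets with full-batch gradient descent (swapped in for SGD for brevity) against uniform noise, and the ground-set size, sample counts, and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

n = 5  # toy ground-set size (illustrative, not from the paper)
# Synthetic "data": a modular distribution in which item i is included
# in a set independently with probability sigmoid(w_true[i]).
w_true = rng.normal(0.0, 1.5, n)
data = (rng.random((2000, n)) < sigmoid(w_true)).astype(float)

# Noise model: uniform over subsets (each item included w.p. 1/2).
noise = (rng.random((2000, n)) < 0.5).astype(float)
log_q = n * np.log(0.5)  # log-probability of any subset under the noise model

# Unnormalized model score s(A) = w . 1_A + c, where c absorbs -log Z.
w, c, lr = np.zeros(n), 0.0, 0.5
for step in range(2000):
    # Posterior probability that a sample came from the data, not the noise.
    g_d = sigmoid(data @ w + c - log_q)
    g_n = sigmoid(noise @ w + c - log_q)
    # Full-batch gradient of the NCE (logistic-discrimination) loss.
    grad_w = -((1 - g_d)[:, None] * data).mean(0) + (g_n[:, None] * noise).mean(0)
    grad_c = -(1 - g_d).mean() + g_n.mean()
    w -= lr * grad_w
    c -= lr * grad_c
# After training, w tracks w_true and c approximates -log Z of the true model.
```

NCE turns density estimation into binary classification: the model only needs unnormalized scores, because the offset `c` learns the (negative log) partition function, which is exactly why it suits models whose normalizer is intractable to compute directly.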