The Implicit Delta Method

Authors: Nathan Kallus, James McInerney

NeurIPS 2022

Reproducibility assessment (each variable is listed with its result, followed by the supporting LLM response excerpt):
Research Type: Experimental
"In this section, we evaluate the finite-difference implicit delta method (FDIDM) on a range of tasks that require confidence intervals. Our goal is to quantify the extent to which FDIDM applies in practice and how it compares to alternative methods. We start with 1D synthetic data in Sec. 5.1, where we apply a neural net to recover known functions from small datasets. Then, in Sec. 5.2, we consider the task of inferring average utility under a neural net trained on a set of real-world benchmark datasets. In Sec. 5.3, we apply FDIDM to variational autoencoders and use the implicit delta perspective to understand the effect of KL down-weighting. We find that the motivation and convergence properties of FDIDM are empirically observed, which may be useful to practitioners seeking to quantify the epistemic uncertainty of complex models on a variety of regression and classification tasks."
Researcher Affiliation: Collaboration
Nathan Kallus, Cornell University & Netflix Research (kallus@cornell.edu); James McInerney, Netflix Research (jmcinerney@netflix.com)
Pseudocode: Yes
"Algorithm 1: Finite-difference implicit delta method (FDIDM)"
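The paper's Algorithm 1 is not reproduced here. As background only, the classical delta method with a finite-difference derivative can be sketched as below; this one-parameter sketch illustrates the underlying idea, not the paper's FDIDM, and the helper name `fd_delta_ci`, the step size `eps`, and the example values are all assumptions for illustration.

```python
import numpy as np

# Classical delta method: given an estimator theta_hat with standard
# error se, the standard error of a smooth functional g(theta_hat) is
# approximately |g'(theta_hat)| * se. Here the derivative is replaced
# by a central finite difference. This is a generic illustration, NOT
# the paper's Algorithm 1 (FDIDM).

def fd_delta_ci(g, theta_hat, se, eps=1e-5, z=1.96):
    """Approximate 95% CI for g(theta_hat) via the delta method,
    using a central finite difference with step size eps for g'."""
    grad = (g(theta_hat + eps) - g(theta_hat - eps)) / (2 * eps)
    se_g = abs(grad) * se
    center = g(theta_hat)
    return center - z * se_g, center + z * se_g

# Example: CI for exp(theta) when theta_hat = 0.5 with se = 0.1.
lo, hi = fd_delta_ci(np.exp, 0.5, 0.1)
```

The central difference keeps the derivative error at O(eps^2), so the interval matches the analytic delta method closely for smooth functionals.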
Open Source Code: Yes
"The source code is available at https://github.com/jamesmcinerney/implicit-delta."
Open Datasets: Yes
"We show both methods on MNIST image classification [23] and a set of UCI benchmark datasets [11] in Fig. 4."
Dataset Splits: No
"In this setting, we wish to calculate confidence intervals over total cost in a downstream task under predictions from the network. An arbitrary cost function is set up; in this case, the average cross entropy of the observations on a held-out validation dataset..."
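The held-out cost mentioned in the excerpt, average cross entropy of observations under the model's predicted probabilities, can be computed as follows. The helper `avg_cross_entropy` and the toy probabilities and labels are illustrative assumptions, not the paper's data.

```python
import numpy as np

# Average cross-entropy of held-out observations under predicted class
# probabilities: the downstream cost over which confidence intervals
# are computed. Toy values below are assumptions for illustration.

def avg_cross_entropy(probs, labels):
    """probs: (n, k) predicted class probabilities; labels: (n,) ints.
    Returns the mean negative log-probability of the true labels."""
    n = len(labels)
    return -np.mean(np.log(probs[np.arange(n), labels]))

probs = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]])
labels = np.array([0, 1, 0])
cost = avg_cross_entropy(probs, labels)
```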
Hardware Specification: Yes
"Run time was measured on a MacBook Pro with a 2.3 GHz Quad-Core Intel Core i7 and 32 GB RAM."
Software Dependencies: No
The paper mentions using neural networks and various models but does not specify software dependencies such as libraries or frameworks with version numbers (e.g., PyTorch or TensorFlow versions).
Experiment Setup: Yes
"Results for IDM, DM, and simulation are all based on estimates using a neural net with 1 hidden layer of 50 tanh units."
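The reported architecture, a neural net with one hidden layer of 50 tanh units, might be sketched in plain NumPy as below. Only the hidden width and activation come from the paper; the input/output dimensions, initialization scheme, and random seed are assumptions.

```python
import numpy as np

# One-hidden-layer net with 50 tanh units, matching the architecture
# reported for the IDM/DM/simulation experiments. Input/output sizes
# and the initialization are assumptions for illustration.

rng = np.random.default_rng(0)

def init_params(d_in=1, d_hidden=50, d_out=1):
    return {
        "W1": rng.normal(0, 1 / np.sqrt(d_in), (d_in, d_hidden)),
        "b1": np.zeros(d_hidden),
        "W2": rng.normal(0, 1 / np.sqrt(d_hidden), (d_hidden, d_out)),
        "b2": np.zeros(d_out),
    }

def forward(params, x):
    # x: (n, d_in) -> predictions of shape (n, d_out)
    h = np.tanh(x @ params["W1"] + params["b1"])
    return h @ params["W2"] + params["b2"]

params = init_params()
y = forward(params, np.linspace(-1, 1, 5).reshape(-1, 1))
```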