Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design
Authors: Adam Foster, Desi R Ivanova, Ilyas Malik, Tom Rainforth
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We apply DAD to a range of problems relevant to applications such as epidemiology, physics and psychology. We find that DAD is able to accurately amortize experiments, opening the door to running adaptive BOED in real time. |
| Researcher Affiliation | Academia | 1Department of Statistics, University of Oxford, UK 2Work undertaken whilst at the University of Oxford. |
| Pseudocode | Yes | Algorithm 1: Deep Adaptive Design (DAD). Input: prior p(θ), likelihood p(y|θ, ξ), number of steps T. Output: design network πφ. |
| Open Source Code | Yes | Code is publicly available at https://github.com/ae-foster/dad. |
| Open Datasets | No | The paper describes generating training examples (e.g., "We generate 200,000 training examples") rather than using a publicly available, pre-existing dataset with specific access information (URL, citation with authors/year, or repository). |
| Dataset Splits | No | We do not use an explicit validation set during training. |
| Hardware Specification | Yes | All experiments were run on a 2.3 GHz 8-Core Intel Core i9 processor with 16GB of DDR4 memory. For the location finding and hyperbolic discounting experiments, the training was conducted on a single NVIDIA Quadro RTX 4000 GPU. |
| Software Dependencies | No | We implement DAD by extending PyTorch (Paszke et al., 2019) and Pyro (Bingham et al., 2018). While these are mentioned, specific version numbers are not provided. |
| Experiment Setup | Yes | We use the Adam optimizer (Kingma & Ba, 2014) with a learning rate of 0.001. For the training of DAD we used L=30 contrastive samples... We generate 200,000 training examples (batches of 128) for the location finding problem and 50,000 for the hyperbolic discounting problem. |
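To make the Algorithm 1 row above more concrete, here is a minimal sketch of a DAD-style objective: a design network πφ that maps the history of (design, outcome) pairs to the next design, trained by maximizing the sequential Prior Contrastive Estimation (sPCE) lower bound with L contrastive prior samples. The toy model (scalar design ξ, standard-normal prior on θ, Gaussian likelihood y | θ, ξ ~ N(θξ, σ²)), the network sizes, and the pooling scheme are illustrative assumptions, not the authors' released implementation at https://github.com/ae-foster/dad.

```python
import torch
import torch.nn as nn

SIGMA = 0.5  # hypothetical observation noise for the toy model (assumption)


class PolicyNet(nn.Module):
    """Design network pi_phi: maps the history of (design, outcome) pairs to the next design."""

    def __init__(self, hidden=64):
        super().__init__()
        self.hidden = hidden
        self.encoder = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(), nn.Linear(hidden, hidden))
        self.head = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, history, batch):
        # history is a list of (xi, y) pairs, each of shape (batch, 1)
        if not history:
            pooled = torch.zeros(batch, self.hidden)
        else:
            encoded = torch.stack([self.encoder(torch.cat(pair, dim=-1)) for pair in history])
            pooled = encoded.sum(dim=0)  # permutation-invariant pooling over past experiments
        return self.head(pooled)


def log_lik(y, theta, xi):
    """Gaussian log-likelihood log p(y | theta, xi) for the toy model y ~ N(theta * xi, SIGMA^2)."""
    return torch.distributions.Normal(theta * xi, SIGMA).log_prob(y)


def neg_spce(policy, T, L, batch):
    """Negative sPCE lower bound: roll out the policy for T steps, then contrast the
    simulating sample theta_0 against L independent prior samples theta_1:L."""
    theta0 = torch.randn(batch, 1)         # theta_0 ~ p(theta), used to simulate outcomes
    theta_l = torch.randn(L, batch, 1)     # contrastive samples theta_1:L ~ p(theta)
    history = []
    log_p0 = torch.zeros(batch, 1)
    log_pl = torch.zeros(L, batch, 1)
    for _ in range(T):
        xi = policy(history, batch)                          # next design from the network
        y = theta0 * xi + SIGMA * torch.randn(batch, 1)      # simulate the outcome under theta_0
        log_p0 = log_p0 + log_lik(y, theta0, xi)
        log_pl = log_pl + log_lik(y, theta_l, xi)
        history.append((xi, y))
    # sPCE = E[ log p(h_T | theta_0) - log( (1/(L+1)) * sum_{l=0..L} p(h_T | theta_l) ) ]
    denom = torch.logsumexp(torch.cat([log_p0.unsqueeze(0), log_pl], dim=0), dim=0) \
        - torch.log(torch.tensor(L + 1.0))
    return -(log_p0 - denom).mean()
```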
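Building on the sketch above (it reuses `PolicyNet` and `neg_spce`), the following hedged snippet mirrors the quoted experiment setup: the Adam optimizer with a learning rate of 0.001, L = 30 contrastive samples, and batches of 128. The number of experiment steps T and the shortened loop length are placeholders rather than values taken from the paper.

```python
import torch

policy = PolicyNet()
optimizer = torch.optim.Adam(policy.parameters(), lr=0.001)  # Adam, lr 0.001, as quoted above

T = 10             # number of experiment steps; placeholder, not taken from the quoted setup
L = 30             # contrastive samples, matching "L=30" in the setup row
BATCH = 128        # "batches of 128" in the setup row
N_STEPS = 1_000    # shortened; the setup row reports 200,000 training examples for location finding

for step in range(N_STEPS):
    optimizer.zero_grad()
    loss = neg_spce(policy, T=T, L=L, batch=BATCH)  # maximizing sPCE = minimizing its negative
    loss.backward()
    optimizer.step()
```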