Position: Bayesian Deep Learning is Needed in the Age of Large-Scale AI

Authors: Theodore Papamarkou, Maria Skoularidou, Konstantina Palla, Laurence Aitchison, Julyan Arbel, David Dunson, Maurizio Filippone, Vincent Fortuin, Philipp Hennig, José Miguel Hernández-Lobato, Aliaksandr Hubin, Alexander Immer, Theofanis Karaletsos, Mohammad Emtiyaz Khan, Agustinus Kristiadi, Yingzhen Li, Stephan Mandt, Christopher Nemeth, Michael A Osborne, Tim G. J. Rudner, David Rügamer, Yee Whye Teh, Max Welling, Andrew Gordon Wilson, Ruqi Zhang

ICML 2024

Reproducibility assessment (variable, result, and LLM response):

- Research Type: Theoretical. This paper posits that BDL can elevate the capabilities of deep learning. It revisits the strengths of BDL, acknowledges existing challenges, and highlights research avenues aimed at addressing these obstacles.
- Researcher Affiliation: Collaboration. 1. Department of Mathematics, The University of Manchester, Manchester, UK. 2. Eric and Wendy Schmidt Center, Broad Institute of MIT and Harvard, Cambridge, USA. 3. Spotify, London, UK. 4. Computational Neuroscience Unit, University of Bristol, Bristol, UK. 5. Centre Inria de l'Université Grenoble Alpes, Grenoble, France. 6. Department of Statistical Science, Duke University, USA. 7. Statistics Program, KAUST, Saudi Arabia. 8. Helmholtz AI, Munich, Germany. 9. Department of Computer Science, Technical University of Munich, Munich, Germany. 10. Munich Center for Machine Learning, Munich, Germany. 11. Tübingen AI Center, University of Tübingen, Tübingen, Germany. 12. Department of Engineering, University of Cambridge, Cambridge, UK. 13. Department of Mathematics, University of Oslo, Oslo, Norway. 14. Bioinformatics and Applied Statistics, Norwegian University of Life Sciences, Ås, Norway. 15. Department of Computer Science, ETH Zurich, Switzerland. 16. Chan Zuckerberg Initiative, California, USA. 17. Center for Advanced Intelligence Project, RIKEN, Tokyo, Japan. 18. Vector Institute, Toronto, Canada. 19. Department of Computing, Imperial College London, London, UK. 20. Department of Computer Science, UC Irvine, Irvine, USA. 21. Department of Mathematics and Statistics, Lancaster University, Lancaster, UK. 22. Department of Engineering Science, University of Oxford, Oxford, UK. 23. Center for Data Science, New York University, New York, USA. 24. Department of Statistics, LMU Munich, Munich, Germany. 25. DeepMind, London, UK. 26. Department of Statistics, University of Oxford, Oxford, UK. 27. Informatics Institute, University of Amsterdam, Amsterdam, Netherlands. 28. Courant Institute of Mathematical Sciences and Center for Data Science, Computer Science Department, New York University, New York, USA. 29. Department of Computer Science, Purdue University, West Lafayette, USA.
- Pseudocode: No. The paper contains mathematical equations and descriptions of methods, but no pseudocode or algorithm blocks.
- Open Source Code: No. As a position paper, it neither describes a novel methodology nor releases code of its own that could be open-sourced. It mentions existing BDL software efforts in Appendix C, but these are not implementations of the paper's own methodology.
- Open Datasets: No. The paper conducts no experiments of its own, so it provides no public dataset access information.
- Dataset Splits: No. With no experiments of its own, the paper provides no training/validation/test splits.
- Hardware Specification: No. With no experiments of its own, the paper provides no hardware specifications.
- Software Dependencies: No. With no experiments of its own, the paper lists no software dependencies with version numbers.
- Experiment Setup: No. With no experiments of its own, the paper provides no experimental setup details such as hyperparameters or training settings.