GFlowOut: Dropout with Generative Flow Networks

Authors: Dianbo Liu, Moksh Jain, Bonaventure F. P. Dossou, Qianli Shen, Salem Lahlou, Anirudh Goyal, Nikolay Malkin, Chris Chinenye Emezue, Dinghuai Zhang, Nadhir Hassen, Xu Ji, Kenji Kawaguchi, Yoshua Bengio

ICML 2023

Reproducibility assessment: each entry below lists the variable, the assessed result, and the LLM response quoting the paper's supporting evidence.

Research Type: Experimental
LLM Response: "To investigate the quality of the posterior distribution learned by GFlowOut, we design empirical experiments, including evaluating robustness to distribution shift during inference, detecting out-of-distribution examples with uncertainty estimates, and transfer learning, using both benchmark datasets and a real-world clinical dataset."

Researcher Affiliation: Collaboration
LLM Response: "Mila Quebec AI Institute; Broad Institute of MIT and Harvard; University of Montreal; McGill University; Lelapa AI; National University of Singapore; Google DeepMind; Technical University of Munich; CIFAR AI Chair."

Pseudocode: Yes
LLM Response: "Algorithm 1 GFlowOut"

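The report only names Algorithm 1; the sketch below is a hedged PyTorch illustration of the core mechanism it refers to, a learned policy that samples a binary dropout mask for a layer conditioned on that layer's activations. The module name MaskPolicy, the single linear head, and the plain Bernoulli sampling are assumptions for illustration, not the authors' implementation; the default temperature of 2 and the 0.1 random-mask probability echo the Experiment Setup entry below.

```python
import torch
import torch.nn as nn

class MaskPolicy(nn.Module):
    """Hedged sketch of a per-layer GFlowOut-style mask sampler (illustrative,
    not the authors' code). Produces a Bernoulli mask from activations and the
    log-probability needed by a trajectory-balance-style training objective."""

    def __init__(self, hidden_dim, temperature=2.0, eps=0.1):
        super().__init__()
        self.head = nn.Linear(hidden_dim, hidden_dim)  # parameterizes q(z | h)
        self.temperature = temperature                 # softens mask probabilities
        self.eps = eps                                 # chance of a uniformly random mask

    def forward(self, h):
        probs = torch.sigmoid(self.head(h) / self.temperature)
        if self.training and torch.rand(()) < self.eps:
            probs = torch.full_like(probs, 0.5)        # exploration: random mask
        z = torch.bernoulli(probs)                     # binary dropout mask
        log_q = (z * probs.clamp_min(1e-8).log()
                 + (1 - z) * (1 - probs).clamp_min(1e-8).log()).sum(-1)
        return h * z, log_q
```

In a full implementation, the log_q terms accumulated across layers would enter a GFlowNet objective such as trajectory balance together with the model's likelihood; that training loop is beyond this sketch.
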
Open Source Code: Yes
LLM Response: "Code is available at https://github.com/kaiyuanmifen/GFNDropout"

Open Datasets: Yes
LLM Response: "We conduct experiments on MNIST, CIFAR-10, and CIFAR-100 datasets with different types and levels of deformations. Records of both hospitals are obtained from the eICU database (Pollard et al., 2018)."

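To make the benchmark setup concrete, the snippet below loads one of the named datasets with a single illustrative deformation (a rotation). The specific transform and its level are assumptions; the paper's exact deformation suite is not enumerated in this excerpt.

```python
import torchvision
import torchvision.transforms as T

# One example deformation (rotation up to 30 degrees); the paper evaluates
# several deformation types and levels, which this sketch does not reproduce.
deform = T.Compose([T.RandomRotation(degrees=30), T.ToTensor()])
cifar10_test = torchvision.datasets.CIFAR10(
    root="./data", train=False, download=True, transform=deform
)
```
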
Dataset Splits: Yes
LLM Response: "Early stopping based on performance on the validation set is used to prevent overfitting. This results in a dataset with 120,945 entries, which is partitioned (70:30 ratio) into the real training and validation sets."

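The 70:30 partition described above can be reproduced with a standard utility. A minimal sketch, assuming the 120,945 eICU entries are available as an indexable sequence (the placeholder records below stands in for them):

```python
from sklearn.model_selection import train_test_split

records = list(range(120945))  # placeholder for the 120,945 eICU entries
train_records, val_records = train_test_split(
    records, train_size=0.7, random_state=0  # 70:30 split; seed is arbitrary
)
print(len(train_records), len(val_records))  # 84661 36284
```
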
Hardware Specification: Yes
LLM Response: "On a single RTX 8000 GPU, training models with GFlowOut takes around the same time as Contextual Dropout and Concrete Dropout, and around twice the time (ResNet: 7 hrs; MCAN Transformer: 16 hrs) as a model with the same architecture and random dropout."

Software Dependencies: No
LLM Response: The paper does not provide specific software dependencies with version numbers, such as Python 3.8 or PyTorch 1.9.

Experiment Setup: Yes
LLM Response: "Hyperparameters of the backbone ResNet and Transformer models were obtained from published baselines or architectures (He et al., 2016; Yu et al., 2019; Fan et al., 2021; Gal & Ghahramani, 2016; Gal et al., 2017). Several GFlowNet-specific hyperparameters are considered in this study, including the architecture of the variational function q(·) and its associated hyperparameters, and the temperature of q(·). For ID-GFlowOut, there is an additional hyperparameter, the prior p(z). These parameters are picked via grid search using the validation set. The temperature of q(·) is set to 2. In addition, with probability 0.1, the forward policy chooses a random mask in each layer."

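As a hedged illustration of the grid search described above, the candidate values and the validate stub below are placeholders, not the paper's actual grid:

```python
from itertools import product

# Illustrative grid over GFlowNet-specific hyperparameters; the candidate
# values and the validate() stub are assumptions, not the paper's settings.
temperatures = [1.0, 2.0, 4.0]   # temperature of q(.); the paper settles on 2
prior_rates = [0.1, 0.3, 0.5]    # Bernoulli prior p(z) used by ID-GFlowOut

def validate(temperature, prior_rate):
    """Placeholder: train with these settings and return validation accuracy."""
    return 0.0  # replace with a real training + validation run

best_temperature, best_prior = max(
    product(temperatures, prior_rates), key=lambda cfg: validate(*cfg)
)
print(best_temperature, best_prior)
```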