Hybrid Energy Based Model in the Feature Space for Out-of-Distribution Detection

Authors: Marc Lafon, Elias Ramzi, Clément Rambour, Nicolas Thome

ICML 2023 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate the significance of the two contributions.
Researcher Affiliation | Collaboration | 1 CEDRIC Laboratory, Cnam, Paris, France; 2 Coexya; 3 Sorbonne Université, CNRS, ISIR, F-75005 Paris, France.
Pseudocode | Yes | Algorithm 1: Hybrid Energy Based Model Training
Open Source Code | Yes | The code is available at: github.com/MarcLafon/heatood.
Open Datasets | Yes | The two commonly used CIFAR-10 and CIFAR-100 (Krizhevsky, 2009) benchmarks as in (Sehwag et al., 2021; Sun et al., 2022). We also conduct experiments on the large-scale ImageNet (Deng et al., 2009) dataset.
Dataset Splits | No | The paper uses standard benchmarks (CIFAR-10, CIFAR-100, ImageNet) but does not explicitly state train/validation/test splits or cite specific predefined splits, beyond naming the datasets themselves.
Hardware Specification | Yes | Times are reported using RGB images of size 224×224, a ResNet-50 with an output size of 2048, 1000 classes (i.e. the ImageNet setup), on a single GPU (Quadro RTX 6000 with 24576 MiB).
Software Dependencies | No | The paper mentions PyTorch and the timm library but does not provide specific version numbers for these software dependencies.
Experiment Setup | Yes | HEAT consists of a 6-layer MLP trained for 20 epochs with Adam at a learning rate of 5e-6. The network input dimension is 512... The hidden dimension is 1024... For SGLD sampling, we use 20 steps with an initial step size of 1e-4 linearly decayed to 1e-5, and an initial noise scale of 5e-3 linearly decayed to 5e-4. We add a small Gaussian noise with std 1e-4... The L2 coefficient is set to 10. We use temperature scaling... with temperature TG = 1e3.
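The reported hyperparameters map onto a small energy head and an SGLD sampler roughly as sketched below. This is a minimal PyTorch sketch, not the authors' released code (see github.com/MarcLafon/heatood for that): the ReLU activations, the absence of normalization layers, the plain SGLD update, and the helper names EnergyHead and sgld_sample are assumptions; only the layer count, dimensions, step count, and the linearly decayed step-size and noise schedules come from the description above.

```python
import torch
import torch.nn as nn

class EnergyHead(nn.Module):
    """6-layer MLP mapping a 512-d feature vector to a scalar energy (assumed architecture)."""
    def __init__(self, in_dim=512, hidden_dim=1024, n_layers=6):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(n_layers - 1):
            layers += [nn.Linear(d, hidden_dim), nn.ReLU()]
            d = hidden_dim
        layers.append(nn.Linear(d, 1))  # scalar energy output
        self.net = nn.Sequential(*layers)

    def forward(self, z):
        return self.net(z).squeeze(-1)

def sgld_sample(energy_fn, z_init, n_steps=20,
                step_size=(1e-4, 1e-5), noise_scale=(5e-3, 5e-4)):
    """SGLD in feature space: 20 steps, step size and noise linearly decayed."""
    z = z_init.clone().detach().requires_grad_(True)
    for t in range(n_steps):
        alpha = t / max(n_steps - 1, 1)                       # 0 -> 1 across the schedule
        lr = step_size[0] + alpha * (step_size[1] - step_size[0])
        sigma = noise_scale[0] + alpha * (noise_scale[1] - noise_scale[0])
        grad = torch.autograd.grad(energy_fn(z).sum(), z)[0]  # gradient of the energy w.r.t. z
        z = (z - lr * grad + sigma * torch.randn_like(z)).detach().requires_grad_(True)
    return z.detach()
```

Per the excerpt, such a head would be trained for 20 epochs with Adam at learning rate 5e-6, with Gaussian noise of std 1e-4 added to the input features; the L2 coefficient of 10 and the temperature TG = 1e3 refer to regularization and temperature-scaling terms whose exact placement is not specified in the quoted text.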