VRA: Variational Rectified Activation for Out-of-distribution Detection
Authors: Mingyu Xu, Zheng Lian, Bin Liu, Jianhua Tao
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on multiple benchmark datasets demonstrate that our method outperforms existing post-hoc strategies. |
| Researcher Affiliation | Academia | 1The State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences 2School of Artificial Intelligence, University of Chinese Academy of Sciences 3Department of Automation, Tsinghua University 4Beijing National Research Center for Information Science and Technology, Tsinghua University |
| Pseudocode | No | The paper describes mathematical functions and operations but does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Our code is available at https://github.com/zeroQiaoba/VRA. |
| Open Datasets | Yes | For CIFAR benchmarks [11] as ID data... select six datasets as OOD data: Textures [12], SVHN [13], Places365 [14], LSUN-Crop [15], LSUN-Resize [15], and iSUN [16]; for ImageNet [17] as ID data... OOD data... iNaturalist [18], SUN [19], Places [14], and Textures [12]. |
| Dataset Splits | Yes | Consistent with previous works [8], we use Gaussian noise images as the validation set for hyperparameter tuning. |
| Hardware Specification | Yes | All experiments are implemented with PyTorch [30] and carried out with NVIDIA Tesla V100 GPU. |
| Software Dependencies | No | The paper mentions 'PyTorch [30]' as an implementation framework, but it does not specify a version number for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | Our method contains three user-specific parameters: the thresholds ηα and ηβ, and the degree of amplification γ. We select ηα from {0.5, 0.6, 0.65, 0.7}, ηβ from {0.8, 0.85, 0.9, 0.95, 0.99}, and γ from {0.2, 0.3, 0.4, 0.5, 0.6, 0.7}. |
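The hyperparameter search described in the Experiment Setup row is a simple exhaustive grid over the three user-specified parameters. A minimal sketch of enumerating that grid is below; the grids are quoted from the paper, while the function name `grid_candidates` and the selection step are illustrative assumptions (the paper selects the best combination on a Gaussian-noise validation set, whose scoring function is not reproduced here).

```python
from itertools import product

# Candidate grids quoted from the paper's experiment setup.
eta_alpha_grid = [0.5, 0.6, 0.65, 0.7]
eta_beta_grid = [0.8, 0.85, 0.9, 0.95, 0.99]
gamma_grid = [0.2, 0.3, 0.4, 0.5, 0.6, 0.7]

def grid_candidates():
    """Enumerate every (eta_alpha, eta_beta, gamma) combination."""
    return list(product(eta_alpha_grid, eta_beta_grid, gamma_grid))

candidates = grid_candidates()
print(len(candidates))  # 4 * 5 * 6 = 120 combinations to evaluate
```

In practice, each of the 120 combinations would be scored on the validation set (e.g. by FPR95 or AUROC) and the best-scoring triple kept for the test-time OOD evaluation.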