Hierarchical Novelty Detection via Fine-Grained Evidence Allocation
Authors: Spandan Pyakurel, Qi Yu
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments conducted on real-world hierarchical datasets demonstrate the proposed model outperforms the strongest baselines and achieves the best HND performance. |
| Researcher Affiliation | Academia | 1Rochester Institute of Technology, Rochester, New York. Correspondence to: Qi Yu <qi.yu@rit.edu>. |
| Pseudocode | Yes | Algorithm 1 E-HND Training |
| Open Source Code | Yes | The source code can be accessed here: https://github.com/ritmininglab/EHND |
| Open Datasets | Yes | Tiny Imagenet (Le & Yang, 2015): It contains 200 classes each with 500 training, 50 validation, and 50 test images in each class, resulting in a total of 120k images. |
| Dataset Splits | Yes | Tiny Imagenet (Le & Yang, 2015): It contains 200 classes each with 500 training, 50 validation, and 50 test images in each class, resulting in a total of 120k images. |
| Hardware Specification | Yes | All the experiments are conducted using NVIDIA GeForce RTX 3060 with 32GB memory. |
| Software Dependencies | Yes | The training algorithm is implemented in pytorch version: 1.13.0 and cuda version: 11.6. |
| Experiment Setup | Yes | We train the model using the full batch of ResNet-101 features with Adam optimizer and an initial learning rate of 10⁻². We use the validation set to select the suitable hyperparameters β1 and β2. The validation set does not include samples from the novel classes. We use the set of (β1, β2) values of (65, 20), (30, 5), (20, 5) and (40, 5) for CUB, AWA2, Tiny Imagenet and Traffic respectively. |
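The quoted setup trains with the Adam optimizer at an initial learning rate of 10⁻². As a minimal, stdlib-only sketch of what that update rule does at this learning rate (the toy quadratic objective, the function name `adam_minimize`, and all default moment hyperparameters are illustrative assumptions, not taken from the paper):

```python
import math

def adam_minimize(grad_fn, w0, lr=1e-2, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=2000):
    """Minimize a scalar objective via the Adam update rule.

    lr=1e-2 mirrors the initial learning rate reported in the paper;
    everything else here is a generic illustration of Adam itself.
    """
    w, m, v = w0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad_fn(w)
        m = beta1 * m + (1 - beta1) * g        # biased first-moment estimate
        v = beta2 * v + (1 - beta2) * g * g    # biased second-moment estimate
        m_hat = m / (1 - beta1 ** t)           # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        w -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return w

# Toy objective f(w) = (w - 3)^2 with gradient 2(w - 3);
# Adam should drive w toward the minimizer w = 3.
w_star = adam_minimize(lambda w: 2 * (w - 3.0), w0=0.0)
```

In the actual experiments the same optimizer is applied full-batch over ResNet-101 features rather than a scalar toy problem, with (β1, β2) in the quote referring to the model's own loss hyperparameters (selected on the validation set), not Adam's moment decay rates.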