Hierarchical classification at multiple operating points
Authors: Jack Valmadre
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on the iNat21 [36] and ImageNet-1k [10] datasets for image classification reveal that a naïve flat softmax classifier dominates the more elegant top-down classifiers, obtaining better accuracy at any operating point. |
| Researcher Affiliation | Academia | Jack Valmadre, Australian Institute for Machine Learning, University of Adelaide, jack.valmadre@adelaide.edu.au |
| Pseudocode | No | The paper describes algorithms verbally and mathematically, but does not include any formal pseudocode or algorithm blocks. |
| Open Source Code | Yes | Code is available online at https://github.com/jvlmdr/hiercls. |
| Open Datasets | Yes | Experiments on the iNat21 [36] and ImageNet-1k [10] datasets for image classification reveal that a naïve flat softmax classifier dominates the more elegant top-down classifiers, obtaining better accuracy at any operating point. |
| Dataset Splits | No | The paper mentions the 'iNat21 validation set' but does not provide explicit percentages or sample counts for training, validation, and test splits. |
| Hardware Specification | Yes | Most experiments were conducted on a single machine with one Nvidia A6000 GPU. ... To obtain error-bars, we used a larger, shared machine with 16 Nvidia V100 GPUs (still using one GPU to train each model). |
| Software Dependencies | No | The code was implemented using the PyTorch library [28], but no specific version number for PyTorch or other software dependencies is provided. |
| Experiment Setup | Yes | For all iNat21 experiments, we use a ResNet-18 model [19] with input images of size 224×224. We start from the PyTorch ImageNet-pretrained checkpoint [28] and train for 20 epochs using SGD with momentum 0.9, cosine schedule [24], batch size 64, initial learning rate 0.01 and weight decay 0.0003. |
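
The Experiment Setup row quotes the paper's training configuration. For reference, the same hyper-parameters can be assembled from standard PyTorch/torchvision components; the sketch below is a hedged illustration, not the authors' released code (see https://github.com/jvlmdr/hiercls for the actual implementation), and the class count and commented-out training loop are assumptions.

```python
# Minimal sketch of the quoted iNat21 training setup, assuming standard
# torchvision/PyTorch components: ResNet-18, ImageNet-pretrained weights,
# 224x224 inputs, SGD with momentum 0.9, cosine schedule, batch size 64,
# initial learning rate 0.01, weight decay 0.0003, 20 epochs.
import torch
import torchvision

NUM_CLASSES = 10000   # iNat21 has 10,000 leaf species (assumption stated for clarity)
EPOCHS = 20

# Start from the ImageNet-pretrained checkpoint and replace the classifier head.
model = torchvision.models.resnet18(
    weights=torchvision.models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = torch.nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.SGD(
    model.parameters(), lr=0.01, momentum=0.9, weight_decay=3e-4)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=EPOCHS)

# A train_loader yielding 224x224 crops with batch size 64 is assumed (details omitted).
# for epoch in range(EPOCHS):
#     for images, labels in train_loader:
#         optimizer.zero_grad()
#         loss = torch.nn.functional.cross_entropy(model(images), labels)
#         loss.backward()
#         optimizer.step()
#     scheduler.step()
```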
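The paper's headline finding (quoted in the Research Type and Open Datasets rows) is that a flat softmax over leaf classes dominates top-down hierarchical classifiers at every operating point. The snippet below is a hypothetical sketch of how a flat leaf classifier can still produce hierarchical predictions: internal-node probabilities are obtained by summing descendant-leaf probabilities, and sweeping a confidence threshold traces out operating points. The function names and the deepest-node-above-threshold rule are illustrative assumptions, not necessarily the paper's exact inference rule.

```python
# Hypothetical sketch: flat softmax over leaves, hierarchical readout by
# summing descendant-leaf probabilities at each node of the class tree.
import torch

def node_probabilities(leaf_logits, leaves_per_node):
    """leaf_logits: (batch, num_leaves); leaves_per_node: dict node -> list of leaf indices."""
    leaf_probs = torch.softmax(leaf_logits, dim=-1)
    return {node: leaf_probs[:, idx].sum(dim=-1)
            for node, idx in leaves_per_node.items()}

def predict_at_threshold(node_probs, depth_of_node, threshold=0.5):
    """Return, per example, the deepest node whose probability meets the threshold.

    Raising the threshold trades specificity for correctness; each threshold
    value corresponds to one operating point. Ties at equal depth are broken
    arbitrarily by node identifier.
    """
    batch = next(iter(node_probs.values())).shape[0]
    preds = []
    for i in range(batch):
        candidates = [(depth_of_node[n], n)
                      for n, p in node_probs.items() if p[i] >= threshold]
        preds.append(max(candidates)[1] if candidates else None)  # None -> fall back to root
    return preds
```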