TransHP: Image Classification with Hierarchical Prompting
Authors: Wenhao Wang, Yifan Sun, Wei Li, Yi Yang
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments show that TransHP improves image classification on accuracy (e.g., improving ViT-B/16 by +2.83% ImageNet classification accuracy), training data efficiency (e.g., +12.69% improvement under 10% ImageNet training data), and model explainability. Moreover, TransHP also performs favorably against prior HIC methods, showing that TransHP well exploits the hierarchical information. |
| Researcher Affiliation | Collaboration | Wenhao Wang (ReLER, University of Technology Sydney); Yifan Sun (Baidu Inc.); Wei Li (Zhejiang University); Yi Yang (Zhejiang University) |
| Pseudocode | No | The paper describes the model architecture and steps but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code is available at: https://github.com/WangWenhao0716/TransHP. |
| Open Datasets | Yes | We evaluate the proposed TransHP on five datasets with hierarchical labels, i.e., ImageNet [10], iNaturalist-2018 [11], iNaturalist-2019 [11], CIFAR-100 [25], and DeepFashion-inshop [26]. |
| Dataset Splits | Yes | To this end, we randomly select 1/10, 1/5, and 1/2 training data from each class in ImageNet (while keeping the validation set untouched). A per-class subsampling sketch follows the table. |
| Hardware Specification | Yes | We train it for 300 epochs on 8 Nvidia A100 GPUs with PyTorch. |
| Software Dependencies | No | The paper mentions PyTorch as a software dependency but does not specify its version number or any other software dependencies with version numbers. |
| Experiment Setup | Yes | The base learning rate is 0.001 with a cosine learning rate schedule. We set the batch size, the weight decay and the number of warm-up epochs to 1,024, 0.05 and 5, respectively. An optimizer/scheduler sketch follows the table. |
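The per-class subsampling in the Dataset Splits row can be made concrete. Below is a minimal sketch, not the authors' script: `subsample_per_class` is a hypothetical helper, and a torchvision `ImageFolder` training set is assumed.

```python
import random
from collections import defaultdict

from torch.utils.data import Subset
from torchvision.datasets import ImageFolder


def subsample_per_class(dataset: ImageFolder, fraction: float, seed: int = 0) -> Subset:
    """Randomly keep `fraction` of the training samples from each class.

    The validation set is left untouched, matching the paper's protocol.
    """
    rng = random.Random(seed)

    # Group sample indices by class label (ImageFolder stores (path, label) pairs).
    by_class = defaultdict(list)
    for idx, (_, label) in enumerate(dataset.samples):
        by_class[label].append(idx)

    # Shuffle within each class and keep the requested fraction (at least one sample).
    kept = []
    for indices in by_class.values():
        rng.shuffle(indices)
        kept.extend(indices[: max(1, int(len(indices) * fraction))])
    return Subset(dataset, kept)


# e.g., the 1/10 ImageNet split used in the data-efficiency experiment:
# train_set = subsample_per_class(ImageFolder("imagenet/train"), fraction=0.1)
```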
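Similarly, here is a sketch of a training configuration consistent with the Experiment Setup row. The quoted hyperparameters (base LR 0.001, cosine schedule, batch size 1,024, weight decay 0.05, 5 warm-up epochs, 300 epochs) come from the paper; the choice of AdamW and a linear-warmup-plus-cosine epoch schedule are assumptions, since the row does not name the optimizer.

```python
import math

import torch

EPOCHS, WARMUP_EPOCHS = 300, 5
BASE_LR, WEIGHT_DECAY, BATCH_SIZE = 1e-3, 0.05, 1024  # BATCH_SIZE is the global batch across 8 GPUs

model = torch.nn.Linear(768, 1000)  # stand-in for the TransHP backbone

# AdamW is an assumption; the paper specifies only the LR, weight decay, and schedule.
optimizer = torch.optim.AdamW(model.parameters(), lr=BASE_LR, weight_decay=WEIGHT_DECAY)


def lr_lambda(epoch: int) -> float:
    """Linear warm-up for 5 epochs, then cosine decay to zero over the remaining epochs."""
    if epoch < WARMUP_EPOCHS:
        return (epoch + 1) / WARMUP_EPOCHS
    progress = (epoch - WARMUP_EPOCHS) / (EPOCHS - WARMUP_EPOCHS)
    return 0.5 * (1.0 + math.cos(math.pi * progress))


scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)

# A per-epoch training loop would call scheduler.step() once after each epoch.
```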