Hyperbolic Knowledge Transfer with Class Hierarchy for Few-Shot Learning
Authors: Baoquan Zhang, Hao Jiang, Shanshan Feng, Xutao Li, Yunming Ye, Rui Ye
IJCAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on three datasets show our method achieves superior performance over state-of-the-art methods, especially on 1-shot tasks. |
| Researcher Affiliation | Academia | Harbin Institute of Technology, Shenzhen |
| Pseudocode | No | The paper describes the steps of the method in prose but does not include a clearly labeled pseudocode or algorithm block. |
| Open Source Code | No | The paper does not include any statement about releasing source code or provide a link to a code repository. |
| Open Datasets | Yes | miniImageNet. This is a subset of ImageNet, which contains 100 classes and 600 images per class. ... tieredImageNet. This dataset is also derived from the ImageNet dataset. ... CIFAR-FS. This dataset is constructed from CIFAR-100 |
| Dataset Splits | Yes | Following the setting of [Peng et al., 2019], we split the dataset into 64 classes for training, 16 classes for validation, and 20 classes for test, respectively. |
| Hardware Specification | No | The paper mentions using 'ResNet12 as our backbone' but does not specify any hardware details such as GPU models, CPU types, or memory. |
| Software Dependencies | No | The paper mentions 'Riemannian Adam optimizer' and 'ResNet12' but does not specify any software versions for libraries, frameworks, or programming languages. |
| Experiment Setup | Yes | These hyper-parameters, γ = 1/640, α = 2, β = 1, are used in all our experiments. For hyper-parameters λc and λr, λc = 1 and λr = 2 are used for miniImageNet, λc = 2 and λr = 4 are used for tieredImageNet, and λc = 1 and λr = 2 are used for CIFAR-FS. ... we set the initial learning rate to 0.00001 and then decay it by 0.1 at epochs 50, 80 and 90, respectively. ... with a weight decay of 0.001. |
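
The reported setup can be collected into a single reference sketch. This is a minimal, hedged reconstruction of the paper's stated hyper-parameters and step-decay schedule, not the authors' code: the `HPARAMS` dictionary keys and the helper `lr_at_epoch` are hypothetical names introduced here for illustration only.

```python
# Hedged sketch of the training setup quoted above. The hyper-parameter
# values (gamma, alpha, beta, lambda_c, lambda_r, weight decay, learning
# rate milestones) are as reported in the paper; all names below are
# assumptions made for this illustration.

HPARAMS = {
    "gamma": 1 / 640,
    "alpha": 2,
    "beta": 1,
    "weight_decay": 1e-3,
    # Dataset-specific loss weights reported in the paper:
    "lambda": {
        "miniImageNet":   {"lambda_c": 1, "lambda_r": 2},
        "tieredImageNet": {"lambda_c": 2, "lambda_r": 4},
        "CIFAR-FS":       {"lambda_c": 1, "lambda_r": 2},
    },
}

def lr_at_epoch(epoch, base_lr=1e-5, milestones=(50, 80, 90), decay=0.1):
    """Step decay: multiply the base rate by `decay` at each milestone
    ("initial learning rate 0.00001, decayed by 0.1 at epochs 50, 80, 90")."""
    steps = sum(1 for m in milestones if epoch >= m)
    return base_lr * decay ** steps

# e.g. lr_at_epoch(0) -> 1e-05, lr_at_epoch(85) -> 1e-07
```

Whether the decay fires at the milestone epoch itself or after it is not stated in the quoted text; the sketch assumes the former.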