Hybrid Graph Neural Networks for Few-Shot Learning
Authors: Tianyuan Yu, Sen He, Yi-Zhe Song, Tao Xiang
AAAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments show that our HGNN obtains new state-of-the-art on three FSL benchmarks. |
| Researcher Affiliation | Collaboration | 1Center for Vision, Speech and Signal Processing, University of Surrey; 2National University of Defense Technology; 3iFlyTek-Surrey Joint Research Centre on Artificial Intelligence |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | The code and models are available at https://github.com/TianyuanYu/HGNN. |
| Open Datasets | Yes | Three widely used FSL benchmarks, MiniImageNet (Vinyals et al. 2016), TieredImageNet (Ren et al. 2018) and CUB-200-2011 (Wah et al. 2011) are used in our experiments. |
| Dataset Splits | Yes | MiniImageNet... consisting of 64 classes for training, and 16 classes and 20 classes for validation and testing respectively. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU/CPU models or processor types used for running experiments. |
| Software Dependencies | No | The paper mentions the use of various models and networks (e.g., CNNs, GNNs) but does not provide specific software dependencies or library version numbers required for replication. |
| Experiment Setup | No | The paper describes the overall training objectives, model architecture, and general training process (e.g., use of cross-entropy losses, end-to-end training, meta-learning setup) but does not provide concrete numerical values for hyperparameters like learning rate, batch size, or number of epochs. |