Interventional Few-Shot Learning
Authors: Zhongqi Yue, Hanwang Zhang, Qianru Sun, Xian-Sheng Hua
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conducted experiments on benchmark datasets in FSL literature: 1) miniImageNet [62] containing 600 images per class over 100 classes. We followed the split proposed in [48]: 64/16/20 classes for train/val/test. ... Table 1: Acc (%) averaged over 2000 5-way FSL tasks before and after applying IFSL. ... Overall, our IFSL achieves the new state-of-the-art on both datasets. |
| Researcher Affiliation | Collaboration | Zhongqi Yue (1,3), Hanwang Zhang (1), Qianru Sun (2), Xian-Sheng Hua (3); 1) Nanyang Technological University, 2) Singapore Management University, 3) Alibaba Group |
| Pseudocode | No | The paper describes its algorithmic implementations but does not include structured pseudocode or clearly labeled algorithm blocks. |
| Open Source Code | Yes | Code is released at https://github.com/yue-zhongqi/ifsl. |
| Open Datasets | Yes | We conducted experiments on benchmark datasets in FSL literature: 1) miniImageNet [62]... 2) tieredImageNet [49]... 3) Caltech-UCSD Birds-200-2011 (CUB) [65] for cross-domain evaluation. |
| Dataset Splits | Yes | We followed the split proposed in [48]: 64/16/20 classes for train/val/test. See the episode-sampling sketch below the table. |
| Hardware Specification | No | The paper mentions 'donations of GPUs' in the acknowledgements, but does not provide specific details about the hardware used for experiments, such as GPU models, CPU types, or memory. |
| Software Dependencies | No | The paper does not provide specific version numbers for software dependencies or libraries used in the experiments. |
| Experiment Setup | Yes | Training and evaluation settings on miniImageNet and tieredImageNet are included in Appendix 5. (Appendix 5 states: 'In all experiments, we used Adam optimizer [31] with an initial learning rate of 1e-3, which decays by 0.5 every 25 epochs up to 75 epochs. The batch size is 16.') See the training-setup sketch below the table. |
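The Research Type and Dataset Splits rows describe class-level splits (64/16/20 classes on miniImageNet) and accuracy averaged over 2000 5-way episodes. The sketch below shows one plausible way such an N-way K-shot episode can be drawn from a class-level split; the function name, data layout, and the `evaluate` helper are illustrative assumptions, not code from the authors' repository.

```python
import random

def sample_episode(class_to_images, n_way=5, k_shot=1, n_query=15):
    """Sample one N-way K-shot episode from a class-level split.

    class_to_images: dict mapping class label -> list of image paths,
    restricted to the split being evaluated (e.g. the 20 test classes).
    """
    classes = random.sample(sorted(class_to_images), n_way)
    support, query = [], []
    for episode_label, cls in enumerate(classes):
        # Draw K support images and n_query query images without overlap.
        images = random.sample(class_to_images[cls], k_shot + n_query)
        support += [(img, episode_label) for img in images[:k_shot]]
        query += [(img, episode_label) for img in images[k_shot:]]
    return support, query

# Evaluation averages accuracy over many such episodes (the paper uses 2000):
# accs = [evaluate(model, *sample_episode(test_split)) for _ in range(2000)]
```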
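The Experiment Setup row quotes Appendix 5's hyperparameters: Adam with an initial learning rate of 1e-3, halved every 25 epochs over 75 epochs, and batch size 16. A minimal PyTorch sketch of that schedule follows, assuming a placeholder model and synthetic data; it is not taken from the released code at https://github.com/yue-zhongqi/ifsl.

```python
import torch
from torch import nn

# Placeholder model and data; the real pipeline trains a few-shot
# classifier on top of a pretrained backbone.
model = nn.Linear(640, 64)
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(160, 640),
                                   torch.randint(0, 64, (160,))),
    batch_size=16)  # Appendix 5: batch size 16

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # initial lr 1e-3
# Halve the learning rate every 25 epochs, up to 75 epochs (Appendix 5).
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=25, gamma=0.5)
criterion = nn.CrossEntropyLoss()

for epoch in range(75):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    scheduler.step()  # per-epoch decay step
```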