Free Lunch for Few-shot Learning: Distribution Calibration
Authors: Shuo Yang, Lu Liu, Min Xu
ICLR 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In the experiments, we show that a simple logistic regression classifier trained with our strategy can achieve state-of-the-art accuracy on three datasets. |
| Researcher Affiliation | Academia | Shuo Yang1, Lu Liu2, Min Xu1 1School of Electrical and Data Engineering, University of Technology Sydney, 2Australian Artificial Intelligence Institute, University of Technology Sydney {shuo.yang, lu.liu-10}@student.uts.edu.au, min.xu@uts.edu.au |
| Pseudocode | Yes | Algorithm 1 Training procedure for an N-way-K-shot task |
| Open Source Code | Yes | The code is available at: https://github.com/ShuoYang-1998/Few_Shot_Distribution_Calibration |
| Open Datasets | Yes | We evaluate our distribution calibration strategy on miniImageNet (Ravi & Larochelle (2017)), tieredImageNet (Ren et al. (2018)) and CUB (Welinder et al. (2010)). |
| Dataset Splits | Yes | miniImageNet... split the dataset into 64 base classes, 16 validation classes, and 20 novel classes. |
| Hardware Specification | No | For feature extractor, we use the WideResNet (Zagoruyko & Komodakis, 2016) trained following previous work (Mangla et al. (2020)). |
| Software Dependencies | No | We use the LR and SVM implementation of scikit-learn (Pedregosa et al. (2011)) with the default settings. |
| Experiment Setup | Yes | Specifically, the number of generated features is 750; k = 2 and λ = 0.5. α is 0.21, 0.21 and 0.3 for miniImageNet, tieredImageNet and CUB, respectively. |
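The setup rows above describe the full pipeline: Tukey-transform the support features, calibrate each class's distribution from its k nearest base-class statistics, sample 750 features, and fit a scikit-learn logistic regression with default-style settings. A minimal sketch of that procedure is below; the base-class statistics (`base_means`, `base_covs`), feature dimension, and random support features are stand-in assumptions (in the paper they come from WideResNet features of the 64 base classes), not the authors' released code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical base-class statistics; in the paper these are computed from
# WideResNet features of the 64 miniImageNet base classes.
n_base, dim, n_way = 64, 16, 5
base_means = rng.normal(size=(n_base, dim))
base_covs = np.stack([np.eye(dim) * 0.5] * n_base)

def tukey(x, lam=0.5):
    # Tukey's ladder-of-powers transform (the paper uses lambda = 0.5).
    return np.sign(x) * np.abs(x) ** lam

def calibrate(x, k=2, alpha=0.21):
    # Average the k nearest base-class statistics with the support feature;
    # alpha adds dispersion to the calibrated covariance (0.21 for miniImageNet).
    idx = np.argsort(np.linalg.norm(base_means - x, axis=1))[:k]
    mean = (base_means[idx].sum(axis=0) + x) / (k + 1)
    cov = base_covs[idx].mean(axis=0) + alpha
    return mean, cov

# 5-way 1-shot task: one support feature per class, 750 generated features total.
support = tukey(rng.normal(size=(n_way, dim)))
feats, labels = [support], [np.arange(n_way)]
per_class = 750 // n_way
for c, x in enumerate(support):
    mean, cov = calibrate(x)
    feats.append(rng.multivariate_normal(mean, cov, size=per_class))
    labels.append(np.full(per_class, c))

# Train a simple logistic regression on support + generated features.
X, y = np.vstack(feats), np.concatenate(labels)
clf = LogisticRegression(max_iter=1000).fit(X, y)
```

With 5 support features plus 150 generated features per class, the classifier is fit on 755 examples, matching the "750 generated features" setting from the table.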