DMN4: Few-Shot Learning via Discriminative Mutual Nearest Neighbor Neural Network

Authors: Yang Liu, Tu Zheng, Jie Song, Deng Cai, Xiaofei He (pp. 1828-1836)

AAAI 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Extensive experiments demonstrate that our method outperforms the existing state-of-the-arts on both fine-grained and generalized datasets."
Researcher Affiliation | Collaboration | Yang Liu1, Tu Zheng1,2, Jie Song3, Deng Cai1,2, Xiaofei He1,2; 1State Key Lab of CAD&CG, College of Computer Science, Zhejiang University; 2Fabu Inc., Hangzhou, China; 3Zhejiang University
Pseudocode | No | No pseudocode or clearly labeled algorithm blocks were found.
Open Source Code | No | No explicit statement about, or link to, open-source code for the described method was found.
Open Datasets | Yes | "miniImageNet (Vinyals et al. 2016) is a subset of ImageNet containing 100 randomly selected classes... Caltech-UCSD Birds-200-2011 (CUB) (Wah et al. 2011)... meta-iNat (Wertheimer and Hariharan 2019)"
Dataset Splits | Yes | "We follow the setup provided by Sachin and Hugo that takes 64, 16 and 20 classes for training, validation and evaluation respectively."
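The 64/16/20 class partition quoted above can be sketched in a few lines. This is illustrative only: the official miniImageNet split fixes the exact class lists, and the class names and seed below are hypothetical.

```python
import random

def split_classes(class_names, n_train=64, n_val=16, n_test=20, seed=0):
    """Partition 100 class names into disjoint train/val/test pools.

    Sketch of the 64/16/20 split sizes only; not the official split.
    """
    assert len(class_names) == n_train + n_val + n_test
    rng = random.Random(seed)
    shuffled = class_names[:]
    rng.shuffle(shuffled)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])

# Hypothetical class names standing in for the 100 miniImageNet classes.
classes = [f"class_{i:03d}" for i in range(100)]
train, val, test = split_classes(classes)
```

In few-shot learning the split is over classes, not images: the evaluation classes are entirely unseen during meta-training.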
Hardware Specification | No | No specific hardware details (GPU/CPU models, memory amounts, or other machine specifications) used for running the experiments were provided.
Software Dependencies | No | No software dependencies with version numbers (e.g., Python 3.8, PyTorch 1.9, CUDA 11.1) were mentioned; only the optimizers (Adam, SGD) are noted.
Experiment Setup | Yes | "We meta-train Conv-4 from scratch for 30 epochs with the Adam optimizer, learning rate 1e-3, decayed by 0.1 every 10 epochs. For ResNet-12, we first pre-train it as in the previous literature and then meta-train it with momentum SGD for 40 epochs. The meta-training learning rate for ResNet-12 is set to 5e-4, decayed by 0.5 every 10 epochs."
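The two step-decay schedules described in the setup (Adam at 1e-3 with x0.1 every 10 epochs for Conv-4; momentum SGD at 5e-4 with x0.5 every 10 epochs for ResNet-12) can be written framework-agnostically. The function below is a sketch of the stated schedule, not the authors' code:

```python
def step_decay_lr(base_lr, gamma, step_size, epoch):
    """Learning rate under step decay: base_lr * gamma ** (epoch // step_size)."""
    return base_lr * gamma ** (epoch // step_size)

# Conv-4: Adam, base 1e-3, decayed by 0.1 every 10 epochs, 30 epochs total.
conv4_lrs = [step_decay_lr(1e-3, 0.1, 10, e) for e in range(30)]

# ResNet-12: momentum SGD, base 5e-4, decayed by 0.5 every 10 epochs, 40 epochs total.
resnet_lrs = [step_decay_lr(5e-4, 0.5, 10, e) for e in range(40)]
```

In PyTorch this corresponds to `torch.optim.lr_scheduler.StepLR` with `step_size=10` and `gamma=0.1` (Conv-4) or `gamma=0.5` (ResNet-12).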