Meta Networks
Authors: Tsendsuren Munkhdalai, Hong Yu
ICML 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We extensively studied the performance and the characteristics of MetaNet on one-shot supervised learning (SL) problems under several different settings. Our proposed method not only improves the state-of-the-art results on the standard benchmarks, but also shows some interesting properties related to generalization and continual learning. We carried out one-shot classification experiments on three datasets: Omniglot, Mini-ImageNet, and MNIST. |
| Researcher Affiliation | Academia | Tsendsuren Munkhdalai and Hong Yu, University of Massachusetts, MA, USA. |
| Pseudocode | Yes | Algorithm 1: MetaNet for one-shot supervised learning |
| Open Source Code | Yes | Our code and data will be made available at: https://bitbucket.org/tsendeemts/metanet |
| Open Datasets | Yes | We carried out one-shot classification experiments on three datasets: Omniglot, Mini-ImageNet, and MNIST. The Omniglot dataset consists of images across 1623 classes with only 20 images per class, from 50 different alphabets (Lake et al., 2015). |
| Dataset Splits | Yes | Following the setup of Vinyals et al. (2016), we split the Omniglot classes into 1200 training and 423 testing classes. The training, dev, and testing sets of 64, 16, and 20 Mini-ImageNet classes (with 600 examples per class) were provided by Ravi & Larochelle (2017). |
| Hardware Specification | No | No specific hardware details (e.g., GPU models, CPU types, cloud instance specifications) are mentioned for running the experiments. |
| Software Dependencies | No | No specific software dependencies with version numbers (e.g., 'Python 3.8', 'PyTorch 1.9') are provided in the paper. |
| Experiment Setup | Yes | We used two three-layer MLPs with 64 hidden units as the embedding function and the base learner. The training details are described in Appendix A. |
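
The split figures quoted in the Dataset Splits row (1200/423 Omniglot classes for training/testing; 64/16/20 Mini-ImageNet classes with 600 examples per class) imply an episodic, N-way one-shot evaluation. The sketch below illustrates how such episodes could be sampled from a class split; the function name, data layout, and sampling details are assumptions for illustration, not the authors' released code.

```python
import random

def sample_one_shot_episode(class_to_images, n_way=5, n_query=1, rng=random):
    """Draw an N-way one-shot episode: one support image per class plus queries.

    `class_to_images` maps a class identifier to its list of image references;
    this data layout is an assumption for illustration.
    """
    classes = rng.sample(sorted(class_to_images), n_way)
    support, query = [], []
    for label, cls in enumerate(classes):
        images = rng.sample(class_to_images[cls], 1 + n_query)
        support.append((images[0], label))          # one labeled example per class
        query.extend((img, label) for img in images[1:])
    return support, query

# Placeholder bookkeeping matching the quoted Omniglot split:
# 1200 training classes and 423 testing classes, 20 images per class (1623 total).
omniglot_train = {f"class_{i}": [f"img_{i}_{j}" for j in range(20)] for i in range(1200)}
omniglot_test  = {f"class_{i}": [f"img_{i}_{j}" for j in range(20)] for i in range(1200, 1623)}

support, query = sample_one_shot_episode(omniglot_train, n_way=5)
```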
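
The Experiment Setup row quotes "two three-layer MLPs with 64 hidden units" serving as the embedding function and the base learner. A minimal sketch of that configuration is given below; the framework (PyTorch), the flattened 28x28 input size, and the 5-way output head are assumptions not stated in the quoted text.

```python
import torch
import torch.nn as nn

def three_layer_mlp(in_dim, out_dim, hidden=64):
    """Three fully connected layers with 64 hidden units, as quoted above."""
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )

# Assumed dimensions: flattened 28x28 inputs, 64-d embeddings, 5-way classification.
embedding_fn = three_layer_mlp(in_dim=28 * 28, out_dim=64)   # embedding function
base_learner = three_layer_mlp(in_dim=28 * 28, out_dim=5)    # base learner / classifier

x = torch.randn(4, 28 * 28)     # toy batch of flattened images
z = embedding_fn(x)             # task embeddings
logits = base_learner(x)        # per-class scores for the one-shot task
```

The sketch only reproduces the layer widths given in the quote; how MetaNet combines these two networks with fast weights and external memory is described in the paper's Algorithm 1 and is not reconstructed here.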