Learning Expressive Meta-Representations with Mixture of Expert Neural Processes
Authors: Qi Wang, Herke van Hoof
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical results demonstrate MoE-NPs' strong generalization capability to unseen tasks in these benchmarks. ... Experimental results and analysis are reported in Section 5. |
| Researcher Affiliation | Academia | Qi Wang, Amsterdam Machine Learning Lab, University of Amsterdam ... Herke van Hoof, Amsterdam Machine Learning Lab, University of Amsterdam |
| Pseudocode | Yes | Pseudo code to optimize these functions is listed in Appendix (A). |
| Open Source Code | Yes | The implementation of MoE-NPs in meta training can be found in Appendix Algorithms (1)/(3), and also please refer to Appendix Algorithms (2)/(4) for the corresponding meta-testing processes. We leave the details of experimental implementations (e.g. parameters, neural architectures, corresponding PyTorch modules and example codes) in Appendix (H). |
| Open Datasets | Yes | We evaluate the performance of models on a system identification task in Acrobot [41] and image completion task in CIFAR10 [42]. ... We use CIFAR10 dataset [42] in this experiment... |
| Dataset Splits | No | The paper describes how context points and target points are used within tasks and refers to 'meta training dataset' and 'test dataset' but does not specify explicit train/validation/test splits (e.g., in percentages or fixed counts) for the overall datasets used in meta-learning. |
| Hardware Specification | No | The paper does not explicitly describe the specific hardware (e.g., GPU models, CPU types, or cloud instance specifications) used for running its experiments in the main text. |
| Software Dependencies | No | The paper mentions 'corresponding PyTorch modules' but does not provide specific version numbers for PyTorch or any other software dependencies needed to replicate the experiment. |
| Experiment Setup | No | We leave the details of experimental implementations (e.g. parameters, neural architectures, corresponding PyTorch modules and example codes) in Appendix (H). |
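
Since the Dataset Splits row notes that tasks are described in terms of context and target points rather than fixed percentage splits, the sketch below illustrates what such a within-task split typically looks like for CIFAR10 image completion with neural processes. This is a minimal illustration under assumptions, not the authors' released code: the function name `make_task`, the coordinate normalisation, and the context size of 100 pixels are all hypothetical choices.

```python
# Minimal sketch (not the authors' implementation): build one image-completion
# task from a CIFAR10 image by treating it as a function from pixel coordinates
# to RGB values, then splitting pixels into context and target sets.
import torch
from torchvision import datasets, transforms

dataset = datasets.CIFAR10(root="./data", train=True, download=True,
                           transform=transforms.ToTensor())

def make_task(image, num_context=100):
    """Split one 32x32 image into context and target (coordinate, value) pairs."""
    c, h, w = image.shape                                   # (3, 32, 32)
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    coords = torch.stack([ys.flatten(), xs.flatten()], dim=-1).float() / (h - 1)
    values = image.reshape(c, -1).t()                       # (h*w, 3) RGB values
    perm = torch.randperm(h * w)
    context_idx = perm[:num_context]                        # observed pixels
    target_idx = perm                                       # predict all pixels
    return (coords[context_idx], values[context_idx]), (coords[target_idx], values[target_idx])

image, _ = dataset[0]
(context_x, context_y), (target_x, target_y) = make_task(image, num_context=100)
print(context_x.shape, context_y.shape, target_x.shape, target_y.shape)
# torch.Size([100, 2]) torch.Size([100, 3]) torch.Size([1024, 2]) torch.Size([1024, 3])
```

In the usual meta-learning setup, each sampled image yields one such task; the 'meta training dataset' and 'test dataset' the paper refers to would then be disjoint pools of images from which these tasks are drawn, which is distinct from the explicit train/validation/test percentages the Dataset Splits row reports as missing.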