Episodic Multi-Task Learning with Heterogeneous Neural Processes

Authors: Jiayi Shen, Xiantong Zhen, Qi Wang, Marcel Worring

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results show the superior performance of the proposed HNPs over typical baselines, and ablation studies verify the effectiveness of the designed inference modules."
Researcher Affiliation | Collaboration | "1 University of Amsterdam, Netherlands, {j.shen, m.worring}@uva.nl; 2 Inception Institute of Artificial Intelligence, Abu Dhabi, UAE, zhenxt@gmail.com; 3 Kaiyuan Mathematical Sciences Institute, Changsha, China, hhq123go@gmail.com"
Pseudocode | Yes | "Please refer to Appendix E for algorithms."
Open Source Code | Yes | "Our code is provided to facilitate such extensions." (https://github.com/autumn9999/HNPs.git)
Open Datasets | Yes | "We use Office-Home [91] and DomainNet [92] as episodic multi-task classification datasets."
Dataset Splits | No | The paper mentions "numbers of meta-training classes and meta-test classes" for the datasets, but does not provide specific splits (e.g., percentages or sample counts) for the training, validation, and test sets needed for reproduction.
Hardware Specification | No | The paper does not specify hardware details such as GPU or CPU models, memory, or the computing environment used to run the experiments.
Software Dependencies | No | The paper does not list version numbers for the software dependencies, libraries, or frameworks used in the implementation.
Experiment Setup | Yes | "Following [12, 18], function-fitting tasks are generated with Gaussian processes (GPs). Here a zero-mean Gaussian process y ~ GP(0, k(·,·)) is used to produce y_τ^{1:4} for the inputs x_τ^{1:4} from all tasks. A radial basis function kernel k(x, x′) = σ² exp(−(x − x′)² / (2l²)), with l = 0.4 and σ = 1.0, is used."
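The experiment-setup quote above describes how the function-fitting tasks are generated: targets are sampled from a zero-mean GP with an RBF kernel (l = 0.4, σ = 1.0). A minimal sketch of that generation step, assuming 1-D inputs drawn uniformly and jointly sampled targets across all tasks (the function names, input range, and point counts here are illustrative, not taken from the paper's released code):

```python
import numpy as np

def rbf_kernel(x1, x2, l=0.4, sigma=1.0):
    """k(x, x') = sigma^2 * exp(-(x - x')^2 / (2 l^2)) for 1-D inputs."""
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return sigma**2 * np.exp(-sq_dists / (2 * l**2))

def sample_gp_tasks(rng, n_tasks=4, n_points=50, x_range=(-2.0, 2.0)):
    """Draw inputs for each task, then sample y ~ GP(0, k) jointly over all inputs."""
    x = rng.uniform(*x_range, size=n_tasks * n_points)
    cov = rbf_kernel(x, x) + 1e-6 * np.eye(x.size)  # small jitter for numerical stability
    y = rng.multivariate_normal(np.zeros(x.size), cov)
    return x.reshape(n_tasks, n_points), y.reshape(n_tasks, n_points)

rng = np.random.default_rng(0)
xs, ys = sample_gp_tasks(rng)
print(xs.shape, ys.shape)  # (4, 50) (4, 50)
```

Sampling all tasks from one joint GP draw, rather than per-task draws, keeps the four tasks correlated through the shared kernel, which matches the quoted setup of producing y_τ^{1:4} from a single process.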