Versatile Neural Processes for Learning Implicit Neural Representations

Authors: Zongyu Guo, Cuiling Lan, Zhizheng Zhang, Yan Lu, Zhibo Chen

ICLR 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate the effectiveness of the proposed VNP on a variety of tasks involving 1D, 2D and 3D signals. Particularly, our method shows promise in learning accurate INRs w.r.t. a 3D scene without further finetuning. Code is available here. 5 EXPERIMENTS The proposed Versatile Neural Process (VNP), as an efficient meta-learner of implicit neural representations, can be implemented into a variety of tasks. We evaluate the effectiveness of VNP on 1D function regression (subsection 5.1), 2D image completion and superresolution (subsection 5.2), and view synthesis for 3D scenes (subsection 5.3), respectively.
Researcher Affiliation | Collaboration | Zongyu Guo (1), Cuiling Lan (2), Zhizheng Zhang (2), Yan Lu (2), Zhibo Chen (1); (1) University of Science and Technology of China, (2) Microsoft Research Asia
Pseudocode | No | The paper includes architectural diagrams (Figure 1, Figure 2) to illustrate the framework and decoder structure, but it does not contain structured pseudocode or algorithm blocks.
Open Source Code | Yes | Code is available here.
Open Datasets | Yes | We conduct experiments on the CelebA dataset (Liu et al., 2015), mainly with the resized resolution of 64×64. ... We conduct experiments on ShapeNet (Chang et al., 2015) objects, including three sub-datasets: cars, lamps, and chairs.
Dataset Splits | No | The paper discusses context and target ratios for sampling points within signals during training and evaluation (e.g., "we control the context ratio as 0.03... The target ratio is set as 0.15"), and it mentions training on synthetic functions. However, it does not provide explicit training, validation, or test splits (as percentages or sample counts) for the datasets used, nor does it mention a separate validation set.
Hardware Specification | No | The paper discusses computational complexity in terms of GFLOPs but does not provide specific hardware details such as GPU or CPU models, memory specifications, or types of computing clusters used for running the experiments.
Software Dependencies | No | The paper does not list specific software dependencies with their version numbers (e.g., "Python 3.x, PyTorch 1.x") required to reproduce the experiments.
Experiment Setup | No | More detailed experimental settings can be found in Appendix B. ... More details on the network structures and hyperparameters can be found in Appendix B.
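The Dataset Splits row quotes the paper's per-signal sampling scheme (context ratio 0.03, target ratio 0.15) rather than a dataset-level split. As an illustration only, a minimal sketch of what such context/target sampling might look like is below; the function name, seeding, and use of disjoint index sets are assumptions, not the authors' released code:

```python
import random

def sample_context_target(num_points, context_ratio=0.03, target_ratio=0.15, seed=0):
    # Illustrative sketch (hypothetical, not the authors' code): draw disjoint
    # context and target index sets from a signal's coordinate grid, using the
    # ratios quoted in the paper (context 0.03, target 0.15).
    n_ctx = int(num_points * context_ratio)
    n_tgt = int(num_points * target_ratio)
    idx = list(range(num_points))
    random.Random(seed).shuffle(idx)
    return idx[:n_ctx], idx[n_ctx:n_ctx + n_tgt]

# A 64x64 image has 4096 pixel coordinates.
ctx, tgt = sample_context_target(64 * 64)
print(len(ctx), len(tgt))  # 122 614
```

Under this reading, "splits" happen per signal at sampling time, which is why no conventional train/validation/test percentages appear in the paper.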