SAPE: Spatially-Adaptive Progressive Encoding for Neural Optimization

Authors: Amir Hertz, Or Perel, Raja Giryes, Olga Sorkine-Hornung, Daniel Cohen-Or

NeurIPS 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We demonstrate the advantage of SAPE on a variety of domains and applications, including regression of low dimensional signals and images, representation learning of occupancy networks, and a geometric task of mesh transfer between 3D shapes.
Researcher Affiliation | Academia | Amir Hertz, Tel Aviv University (amirhertz@mail.tau.ac.il); Or Perel, Tel Aviv University (orr.perel@gmail.com); Raja Giryes, Tel Aviv University (raja@tauex.tau.ac.il); Olga Sorkine-Hornung, ETH Zurich, Switzerland (sorkine@inf.ethz.ch); Daniel Cohen-Or, Tel Aviv University (cohenor@gmail.com)
Pseudocode | No | The paper states 'The full algorithm is included in the appendix.', but the provided text does not include the appendix, so no pseudocode is present in the given content.
Open Source Code | No | The paper does not explicitly state that its source code is open, nor does it provide a link to a repository. It mentions that 'full implementation details appear in the appendix', but this does not guarantee that the code is open source.
Open Datasets | Yes | We conduct the evaluation on the same test sets as Tancik et al. [43]... The first set is composed of 10 selected models from the Thingi10K dataset [50]... we test the networks on 20 shapes from the MPEG7 dataset [20].
Dataset Splits | No | The bandwidths of encoding functions in 3) and 5) are optimally selected by a grid search over a validation set or taken from a public implementation, depending on the task.
Hardware Specification | No | The paper does not specify any hardware details (e.g., GPU, CPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions types of neural networks (MLP, SIREN, FFN) and general concepts, but does not provide specific software names with version numbers required for reproduction (e.g., 'PyTorch 1.9' or 'TensorFlow 2.x').
Experiment Setup | Yes | All configurations employ 256 unique frequency encodings sampled from a Gaussian distribution... For the convergence threshold ε, we set the values of 1e-3 for regression tasks and 1e-2 for geometric tasks. See the appendix for a full description of implementation details.
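
The quoted setup refers to a Fourier feature encoding in the style of Tancik et al. [43], with 256 frequencies drawn from a Gaussian. As a rough illustration only, the sketch below shows how such a Gaussian-sampled encoding could be constructed in Python/NumPy; the bandwidth sigma, the function name, and the example inputs are placeholders and are not values reported in the paper.

import numpy as np

def gaussian_fourier_encoding(x, num_frequencies=256, sigma=10.0, seed=0):
    # Sample a fixed frequency matrix B from a Gaussian, matching the quoted
    # setup ("256 unique frequency encodings sampled from a Gaussian
    # distribution"). sigma is a hypothetical bandwidth, not a paper value.
    rng = np.random.default_rng(seed)
    B = rng.normal(0.0, sigma, size=(num_frequencies, x.shape[-1]))
    proj = 2.0 * np.pi * x @ B.T  # shape (N, num_frequencies)
    # Standard Fourier feature mapping: concatenate sine and cosine projections.
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=-1)

# Example: encode 2D coordinates into a 2 * 256 = 512-dimensional feature.
coords = np.random.rand(4, 2)
print(gaussian_fourier_encoding(coords).shape)  # (4, 512)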