Parametrizing Product Shape Manifolds by Composite Networks

Authors: Josua Sassen, Klaus Hildebrandt, Martin Rumpf, Benedikt Wirth

ICLR 2023

Reproducibility Variable | Result | LLM Response
Research Type: Experimental
LLM Response: "We demonstrate the effectiveness of our proposed approach with experiments on synthetic data as well as manifolds extracted from data via SPGA. We evaluate our approach with experiments on data manifolds of triangle meshes, both synthetic ones and ones extracted from data via SPGA, and we demonstrate that the proposed composite network architecture outperforms a monolithic fully connected network architecture as well as an approach based on the affine combination of the factors."
Researcher Affiliation: Academia
LLM Response: Josua Sassen (University of Bonn), Klaus Hildebrandt (TU Delft), Martin Rumpf (University of Bonn), Benedikt Wirth (University of Münster)
Pseudocode: No
LLM Response: No pseudocode or clearly labeled algorithm blocks were found in the paper.
Open Source Code: Yes
LLM Response: "The data and the generating code can be found at https://gitlab.com/jrsassen/freaky-torus."
Open Datasets: Yes
LLM Response: "For the Freaky Torus dataset... The data and the generating code can be found at https://gitlab.com/jrsassen/freaky-torus. We repeat three of the examples discussed by Sassen et al. (2020b) and consider one new dataset. The repeated examples are a humanoid dataset from (Anguelov et al., 2005), a dataset of face meshes from (Zhang et al., 2004), and a set of hand meshes from (Yeh et al., 2011). For the new example, we examine a humanoid dataset based on SMPL-X (Pavlakos et al., 2019)."
Dataset Splits: No
LLM Response: "The dataset was split randomly into a training (80%) and a test (20%) set, with the training set being used for the descent method of the loss functionals and the test set being used to evaluate the performance of the networks." A validation split is not explicitly mentioned.
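The random 80%/20% train/test split quoted above can be sketched in plain Python. The dataset size, the fixed seed, and the index-based representation of samples are illustrative assumptions, not details taken from the paper.

```python
import random

# Stand-in for the mesh dataset: each index represents one sample.
# The dataset size (100) and the fixed seed (0) are assumptions for
# illustration only.
indices = list(range(100))
rng = random.Random(0)
rng.shuffle(indices)

# Random 80% training / 20% test split, as stated in the paper.
n_train = int(0.8 * len(indices))
train_idx, test_idx = indices[:n_train], indices[n_train:]
print(len(train_idx), len(test_idx))  # 80 20
```

In a PyTorch pipeline, the same split could be obtained with `torch.utils.data.random_split`, though the paper does not say which mechanism was used.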
Hardware Specification: No
LLM Response: No specific hardware details (e.g., GPU model, CPU type, memory size) used for running the experiments were provided.
Software Dependencies: No
LLM Response: "We implemented the neural networks in PyTorch (Paszke et al., 2019) using the PyTorch Geometric library (Fey & Lenssen, 2019). The tools for the NRIC manifold were implemented in C++ based on OpenMesh (Botsch et al., 2002), where we use the Eigen library (Guennebaud et al., 2010) for numerical linear algebra." Specific version numbers for these software packages are not mentioned.
Experiment Setup: Yes
LLM Response: "We used Adam (Kingma & Ba, 2015) as descent method for training all networks, where the initial learning rate was 10^-3 and was reduced by a factor of 10 every time the loss did not decrease for multiple iterations. For regularization, we used batch normalization after each layer and a moderate dropout regularization (p = 0.1) after each convolutional layer."
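The learning-rate schedule described above (start at 10^-3, multiply by 0.1 when the loss stops decreasing) can be sketched as a minimal pure-Python plateau scheduler. The patience value is an assumption, since the paper only says "multiple iterations".

```python
class ReduceOnPlateau:
    """Minimal sketch of the reported schedule: start at 1e-3 and multiply
    the learning rate by 0.1 whenever the loss has not decreased for
    `patience` consecutive steps. The patience value (3) is an assumption;
    the paper only says the loss 'did not decrease for multiple iterations'.
    """

    def __init__(self, lr=1e-3, factor=0.1, patience=3):
        self.lr = lr
        self.factor = factor
        self.patience = patience
        self.best = float("inf")
        self.bad_steps = 0

    def step(self, loss):
        if loss < self.best:          # loss improved: record it, reset counter
            self.best = loss
            self.bad_steps = 0
        else:                         # loss stalled: count, decay if patience hit
            self.bad_steps += 1
            if self.bad_steps >= self.patience:
                self.lr *= self.factor
                self.bad_steps = 0
        return self.lr

# Example: the loss stalls for three steps, triggering one factor-10 decay.
sched = ReduceOnPlateau()
lrs = [sched.step(l) for l in [1.0, 0.8, 0.8, 0.8, 0.8, 0.7]]
```

In PyTorch, this behavior corresponds to `torch.optim.lr_scheduler.ReduceLROnPlateau` with `factor=0.1`, used alongside `torch.optim.Adam`.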