ResNet After All: Neural ODEs and Their Numerical Solution

Authors: Katharina Ott, Prateek Katiyar, Philipp Hennig, Michael Tiemann

ICLR 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We verify this adaptation algorithm on a common benchmark dataset as well as a synthetic dataset." "In this subsection, we present results from the experiments performed on Sphere2 and CIFAR10 datasets using both fixed step and adaptive step solvers."
Researcher Affiliation | Collaboration | Katharina Ott (Bosch Center for Artificial Intelligence, Renningen, Germany; University of Tübingen, Germany) katharina.ott3@de.bosch.com; Prateek Katiyar (Bosch Center for Artificial Intelligence, Renningen, Germany) prateek.katiyar@de.bosch.com; Philipp Hennig (University of Tübingen; MPI for Intelligent Systems, Tübingen, Germany) philipp.hennig@uni-tuebingen.de; Michael Tiemann (Bosch Center for Artificial Intelligence, Renningen, Germany) michael.tiemann@de.bosch.com
Pseudocode | Yes | Algorithm 1: Step and tolerance adaptation algorithm (Page 5); Algorithm 2: Step adaptation algorithm (Page 13); Algorithm 3: Tolerance adaptation algorithm (Page 14). (A hedged sketch of the step-adaptation idea is given after the table.)
Open Source Code | Yes | Code: https://github.com/boschresearch/numerics_independent_neural_odes
Open Datasets | Yes | "We verify this adaptation algorithm on a common benchmark dataset as well as a synthetic dataset." "For our experiments, we introduce a classification task based on the concentric sphere dataset proposed by Dupont et al. (2019)." "In this subsection, we present results from the experiments performed on Sphere2 and CIFAR10 datasets using both fixed step and adaptive step solvers." "For additional results on MNIST we refer to the Supplementary Material Section B."
Dataset Splits | No | The paper discusses training and testing with different step sizes and solvers, but it does not specify explicit training/validation/test dataset splits or percentages.
Hardware Specification | No | The paper does not provide any specific hardware details such as GPU/CPU models, memory, or the type of computing resources used for the experiments.
Software Dependencies | Yes | "In our code we make use of the following packages: Matplotlib (Hunter, 2007), Numpy (Harris et al., 2020), Pytorch (Paszke et al., 2019) and Torchdiffeq (Chen et al., 2018)." (A minimal usage sketch of these packages follows the table.)
Experiment Setup | Yes | Hyper-parameters: batch size 256; optimizer SGD (fixed-step solvers) or Adam (adaptive-step solvers); learning rate 1e-2 (fixed-step solvers) or 1e-4 (adaptive-step solvers); 7020 training iterations (Page 16, Section D.1, for MNIST). Similar details are provided for CIFAR10 and Concentric Sphere 2D. (A configuration sketch follows the table.)
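
To make the Pseudocode row concrete, here is a minimal sketch of a step-adaptation check in the spirit of Algorithms 1 and 2, not the authors' exact procedure: the helper name `solve`, the acceptance criterion, and the tolerance value are assumptions. The underlying idea from the paper is to refine the fixed step size until the discretized solution stops changing, so the trained model behaves like a genuine ODE rather than a step-size-dependent ResNet.

```python
import torch
from torchdiffeq import odeint

def solve(func, y0, n_steps):
    # Fixed-step Euler integration on [0, 1] via torchdiffeq.
    t = torch.tensor([0.0, 1.0])
    sol = odeint(func, y0, t, method='euler',
                 options={'step_size': 1.0 / n_steps})
    return sol[-1]  # state at t = 1

def adapt_steps(func, y0, n_steps=4, tol=1e-2, max_steps=1024):
    # Hypothetical acceptance test: halve the step size until the
    # solution no longer changes noticeably, i.e. the discretization
    # is fine enough for the model to act like a true ODE.
    while n_steps < max_steps:
        coarse = solve(func, y0, n_steps)
        fine = solve(func, y0, 2 * n_steps)
        rel_change = torch.norm(fine - coarse) / torch.norm(fine)
        if rel_change < tol:
            break
        n_steps *= 2
    return n_steps
```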
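
As a complement to the Software Dependencies row, the following self-contained sketch shows how Torchdiffeq is typically driven from PyTorch, covering both solver families the paper compares. The ODE function, layer widths, step size, and tolerance values here are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn
from torchdiffeq import odeint  # Chen et al. (2018)

class ODEFunc(nn.Module):
    # Toy dynamics f(t, y); the paper's actual architectures differ.
    def __init__(self, dim=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, 50), nn.Tanh(), nn.Linear(50, dim))

    def forward(self, t, y):
        return self.net(y)

func = ODEFunc()
y0 = torch.randn(256, 2)          # batch of initial states
t = torch.tensor([0.0, 1.0])      # integrate from t = 0 to t = 1

# Fixed-step solver: the step size is an explicit hyperparameter.
y_fixed = odeint(func, y0, t, method='euler',
                 options={'step_size': 0.1})

# Adaptive-step solver: the tolerances control the step size.
y_adaptive = odeint(func, y0, t, method='dopri5',
                    rtol=1e-3, atol=1e-4)
```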
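
Finally, the reported hyper-parameters in the Experiment Setup row translate directly into optimizer configuration. The sketch below pins them down, assuming a generic `model`; it is a reading of the table entry, not code from the repository.

```python
import torch

BATCH_SIZE = 256  # reported batch size

def make_optimizer(model, adaptive_solver):
    # Reported settings: SGD with lr 1e-2 for fixed-step solvers,
    # Adam with lr 1e-4 for adaptive-step solvers.
    if adaptive_solver:
        return torch.optim.Adam(model.parameters(), lr=1e-4)
    return torch.optim.SGD(model.parameters(), lr=1e-2)
```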