Adaptive Neural Trees
Authors: Ryutaro Tanno, Kai Arulkumaran, Daniel Alexander, Antonio Criminisi, Aditya Nori
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We empirically validate these benefits for regression and classification through experiments on the SARCOS (Vijayakumar & Schaal, 2000), MNIST (LeCun et al., 1998) and CIFAR-10 (Krizhevsky & Hinton, 2009) datasets. |
| Researcher Affiliation | Collaboration | 1University College London, UK 2Imperial College London, UK 3Microsoft Research, Cambridge, UK. |
| Pseudocode | Yes | We include a pseudocode of the training algorithm in Supp. Sec. A. |
| Open Source Code | Yes | Codes: https://github.com/rtanno21609/AdaptiveNeuralTrees |
| Open Datasets | Yes | We evaluate ANTs using the SARCOS multivariate regression dataset (Vijayakumar & Schaal, 2000), and the MNIST (LeCun et al., 1998) and CIFAR-10 (Krizhevsky & Hinton, 2009) classification datasets. |
| Dataset Splits | Yes | The best model is picked based on the performance on the same validation set of 5k examples as before. |
| Hardware Specification | No | Full training details, including training times on a single GPU, are provided in Supp. Sec. C and D. |
| Software Dependencies | No | All of our models are implemented in PyTorch (Paszke et al., 2017). |
| Experiment Setup | No | Full training details, including training times on a single GPU, are provided in Supp. Sec. C and D. |