Grid-Functioned Neural Networks

Authors: Javier Dehesa, Andrew Vidler, Julian Padget, Christof Lutteroth

ICML 2021

Reproducibility variables, each with its result and the supporting excerpt from the paper:
Research Type: Experimental. "We present a full characterisation of its computational and spatial complexity, and demonstrate its potential, compared to a traditional architecture, over a set of synthetic regression problems. We further illustrate the benefits through a real-world 3D skeletal animation case study."
Researcher Affiliation: Collaboration. "1 Department of Computer Science, University of Bath, Bath, UK; 2 Ninja Theory, Cambridge, UK."
Pseudocode: No. The paper provides mathematical definitions and equations for the model but does not include any structured pseudocode or algorithm blocks.
Open Source Code: No. The paper does not provide any explicit statement of, or link to, open-source code for the described methodology.
Open Datasets: Yes. "We use the same dataset and input and output encoding presented by Zhang et al. (2018), changing only the model doing the prediction."
Dataset Splits: Yes. "For each problem, 200 points are randomly sampled, 80% of which are used for training and 20% for evaluation."
Hardware Specification: Yes. "Evaluation time measured on an Intel Core i7-7700K CPU running at 4.20 GHz."
Software Dependencies: No. The paper mentions using "Adam optimisation" and "ReLU activation" but does not provide version numbers for any software dependencies, libraries, or frameworks.
Experiment Setup: Yes. "Every model was trained on each problem to minimise the mean squared error at the output using Adam optimisation (Kingma & Ba, 2015). The training ran for 100 000 steps on batches of 32 examples per step with a fixed learning rate of 0.001."
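The dataset-split protocol quoted above (200 randomly sampled points per problem, 80% for training and 20% for evaluation) can be sketched as follows. The 2-D input space, the sampling range, and the random seed are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Sketch of the split described in the report: 200 randomly sampled
# points per problem, 80% used for training and 20% for evaluation.
rng = np.random.default_rng(0)                    # seed is an assumption
points = rng.uniform(-1.0, 1.0, size=(200, 2))    # hypothetical 2-D inputs

perm = rng.permutation(len(points))               # shuffle before splitting
n_train = int(0.8 * len(points))                  # 160 training, 40 held out
train, held_out = points[perm[:n_train]], points[perm[n_train:]]

print(train.shape, held_out.shape)
```

A fresh random permutation per problem ensures the held-out 20% is a uniform sample of the 200 points rather than a fixed tail of the sampling order.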
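The training regime in the last row (mean squared error, Adam optimisation, batches of 32, fixed learning rate 0.001) can be illustrated with a hand-rolled Adam update. The linear model, the noiseless synthetic data, and the shortened step count are assumptions for illustration only; the paper's grid-functioned architecture is not reproduced here:

```python
import numpy as np

# Hedged sketch of the quoted training regime: MSE loss, Adam
# (Kingma & Ba, 2015), batches of 32, fixed learning rate 0.001.
rng = np.random.default_rng(0)
X = rng.normal(size=(160, 4))                 # hypothetical training inputs
true_w = np.array([1.0, -2.0, 0.5, 3.0])      # hypothetical target weights
y = X @ true_w                                # noiseless linear targets

w = np.zeros(4)                               # model parameters
m, v = np.zeros(4), np.zeros(4)               # Adam moment estimates
lr, b1, b2, eps = 1e-3, 0.9, 0.999, 1e-8      # lr matches the report

for step in range(1, 10_001):                 # the paper runs 100 000 steps
    idx = rng.integers(0, len(X), size=32)    # batch of 32 examples per step
    xb, yb = X[idx], y[idx]
    grad = 2 * xb.T @ (xb @ w - yb) / len(xb)  # gradient of the batch MSE
    m = b1 * m + (1 - b1) * grad               # biased first moment
    v = b2 * v + (1 - b2) * grad ** 2          # biased second moment
    m_hat = m / (1 - b1 ** step)               # bias correction
    v_hat = v / (1 - b2 ** step)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)   # Adam parameter update

print(np.round(w, 2))
```

With noiseless data every batch shares the same minimiser, so the iterates converge cleanly to `true_w`; the step count is cut from 100 000 to 10 000 purely to keep the sketch quick to run.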