Reverse Differentiation via Predictive Coding

Authors: Tommaso Salvatori, Yuhang Song, Zhenghua Xu, Thomas Lukasiewicz, Rafal Bogacz

AAAI 2022, pp. 8150-8158

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We experimentally analyze the running time of Z-IL, IL, and BP on different architectures. The results of BP, IL, and Z-IL, averaged over all the experiments per model, are reported in Table 2, and a detailed description of the experiments, as well as all the parameters needed to reproduce the results, are provided in the supplementary material."
Researcher Affiliation | Academia | Tommaso Salvatori (1), Yuhang Song (1, 2), Zhenghua Xu (3), Thomas Lukasiewicz (1), Rafal Bogacz (2). (1) Department of Computer Science, University of Oxford, UK; (2) MRC Brain Network Dynamics Unit, University of Oxford, UK; (3) State Key Laboratory of Reliability and Intelligence of Electrical Equipment, Hebei University of Technology, Tianjin, China.
Pseudocode | Yes | Algorithm 1: Learning one training pair (s, y) with Z-IL; Algorithm 3: Z-IL for computational graphs.
Open Source Code | No | The paper states that "a detailed description of the experiments, as well as all the parameters needed to reproduce the results, are provided in the supplementary material," but it does not mention releasing source code for the methodology or provide a link to a repository.
Open Datasets | No | The paper reports experiments on several architectures (MLPs, CNNs, RNNs, ResNet18, Transformer) and refers to training on a "data point s" or "labelled point (s, y)", but it does not name any specific public dataset or provide concrete access information (links, DOIs, or formal citations).
Dataset Splits | No | The paper gives no specific details about dataset splits (e.g., percentages or sample counts for training, validation, or test sets). Algorithm 1 mentions training on "a labelled point (s, y)", but no broader splitting strategy is described for the experiments.
Hardware Specification | No | The paper acknowledges "the use of the EPSRC-funded Tier 2 facility JADE (EP/P020275/1) and GPU computing support by Scan Computers International Ltd.", but it does not specify exact GPU models, CPU types, or other hardware details used for the experiments.
Software Dependencies | No | The paper does not provide version numbers for any software dependencies, such as the programming language, libraries, or frameworks used in the experiments.
Experiment Setup | Yes | "A detailed description of the experiments, as well as all the parameters needed to reproduce the results, are provided in the supplementary material."
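For context on the Z-IL algorithm named in the Pseudocode row: the paper's central claim is that a predictive-coding network, with value nodes initialised by the feedforward pass, the output clamped to the label, an inference rate of 1, and layer-l weights updated at inference step t = l, produces exactly the backpropagation updates. Below is a minimal sketch of that equivalence for a hypothetical two-layer linear network; all variable names (W1, W2, x1, e1, e2) are illustrative choices, not the paper's notation, and this is not the authors' released code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-layer linear network: y_hat = W2 @ W1 @ s.
d_in, d_hid, d_out = 4, 5, 3
W1 = rng.standard_normal((d_hid, d_in))
W2 = rng.standard_normal((d_out, d_hid))
s = rng.standard_normal((d_in, 1))   # input ("data point s")
y = rng.standard_normal((d_out, 1))  # label

# --- Backprop gradients for L = 0.5 * ||y - y_hat||^2 ---
h = W1 @ s
y_hat = W2 @ h
err = y - y_hat
grad_W2 = -err @ h.T
grad_W1 = -(W2.T @ err) @ s.T

# --- Z-IL-style schedule: value nodes start at their feedforward
#     predictions, the output node is clamped to y, inference uses
#     rate gamma = 1, and layer-l weights update at step t = l.
x1 = W1 @ s            # hidden value node, initialised to its prediction
e2 = y - W2 @ x1       # output-layer prediction error (output clamped to y)
zil_dW2 = e2 @ x1.T    # t = 0: Hebbian-style update of the output weights

x1 = x1 + W2.T @ e2    # one inference step (gamma = 1; hidden error is 0 so far)
e1 = x1 - W1 @ s       # hidden-layer prediction error after inference
zil_dW1 = e1 @ s.T     # t = 1: update of the first-layer weights

# Under this schedule the PC updates equal the negative BP gradients.
print(np.allclose(zil_dW2, -grad_W2), np.allclose(zil_dW1, -grad_W1))
# prints: True True
```

With nonlinear activations the same schedule applies, with the backward term gated by the activation derivative; the sketch uses a linear network only to keep the equivalence check short.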