SpikeGrad: An ANN-equivalent Computation Model for Implementing Backpropagation with Spikes
Authors: Johannes C. Thiele, Olivier Bichler, Antoine Dupret
ICLR 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate classification accuracies equivalent or superior to existing implementations of SNNs trained with full precision gradients, and comparable to the precision of standard ANNs using similar topologies. This is the first time competitive classification performances are reported on the CIFAR10 and CIFAR100 datasets using a large-scale SNN where both training and inference are fully implemented with spikes. (from Introduction) Also, from Section 4 (Experiments): Classification performance: Tables 1 and 2 compare the state-of-the-art results for SNNs on the MNIST and CIFAR10 datasets. |
| Researcher Affiliation | Academia | Johannes C. Thiele, Olivier Bichler & Antoine Dupret CEA, LIST 91191 Gif-sur-Yvette, France {johannes.thiele,olivier.bichler,antoine.dupret}@cea.fr |
| Pseudocode | Yes | The SpikeGrad algorithm can also be expressed in an event-based formulation, described in Algorithms 1, 2 and 3. |
| Open Source Code | No | The paper does not provide any explicit statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | Classification performance Tables 1 and 2 compare the state-of-the-art results for SNNs on the MNIST and CIFAR10 datasets. |
| Dataset Splits | Yes | We separate the training set of size 60000 into 50000 training and 10000 validation examples, which are used to monitor convergence. Testing is performed on the test set of 10000 examples. (A minimal sketch of this split follows the table.) |
| Hardware Specification | Yes | Training is performed on RTX 2080 Ti graphic cards. |
| Software Dependencies | No | All experiments are performed with custom CUDA/cuDNN accelerated C++ code. (No version numbers are provided for CUDA or cuDNN.) |
| Experiment Setup | Yes | The hyperparameters for training can be seen in tables 3 and 4. |
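
The dataset split quoted in the "Dataset Splits" row is straightforward to reproduce independently of the authors' (unreleased) code. Below is a minimal sketch assuming NumPy and an MNIST-style dataset of 60000 training and 10000 test examples; the function name, the shuffling choice, and the seed are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of the split described in the paper: the 60000 MNIST training
# examples are divided into 50000 for training and 10000 for validation
# (used to monitor convergence); the 10000-example test set is left untouched.
# NOTE: names, shuffling, and the seed are illustrative assumptions, not the
# authors' implementation.

def split_train_val(train_images, train_labels, n_val=10000, seed=0):
    """Shuffle the training set and carve off n_val examples for validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(train_images))
    val_idx, train_idx = idx[:n_val], idx[n_val:]
    return (train_images[train_idx], train_labels[train_idx],
            train_images[val_idx], train_labels[val_idx])

# Hypothetical usage with any MNIST loader returning (60000, 28, 28) images:
# x_train, y_train = load_mnist_train()
# x_tr, y_tr, x_val, y_val = split_train_val(x_train, y_train)
# assert len(x_tr) == 50000 and len(x_val) == 10000
```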