Optimized Potential Initialization for Low-Latency Spiking Neural Networks

Authors: Tong Bu, Jianhao Ding, Zhaofei Yu, Tiejun Huang

AAAI 2022 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We evaluate our algorithm on the CIFAR-10, CIFAR-100 and ImageNet datasets and achieve state-of-the-art accuracy, using fewer time-steps.
Researcher Affiliation | Academia | Tong Bu (1,2), Jianhao Ding (2), Zhaofei Yu (1,2)*, Tiejun Huang (1,2); 1: Institute for Artificial Intelligence, Peking University; 2: Department of Computer Science and Technology, Peking University
Pseudocode | Yes | Algorithm 1: Overall algorithm of ANN to SNN conversion. (A minimal neuron-level sketch of this conversion idea follows the table.)
Open Source Code | No | The paper does not explicitly state that its own source code is available or provide a link to it. It only mentions testing a third-party RNL model with 'codes on GitHub provided by the authors'.
Open Datasets | Yes | We evaluate the performance of our methods for classification tasks on the CIFAR-10, CIFAR-100 and ImageNet datasets.
Dataset Splits | No | The paper mentions using CIFAR-10, CIFAR-100, and ImageNet, which have standard splits, but it does not explicitly state the train/validation/test percentages or sample counts used.
Hardware Specification | Yes | All experiments are implemented with PyTorch on an NVIDIA Tesla V100 GPU.
Software Dependencies | No | The paper mentions 'PyTorch' but does not specify a version number or other software dependencies with their versions.
Experiment Setup | Yes | The initial learning rates for CIFAR-10 and CIFAR-100 are 0.1 and 0.02, respectively. Each model is trained for 300 epochs. For the ImageNet dataset, the initial learning rate is set to 0.1 and the total number of epochs is set to 120. The L2-regularization coefficient of the weights and biases is set to 5 × 10^-4 for the CIFAR datasets and 1 × 10^-4 for ImageNet. The weight decays of the upper-bound parameter θ are 1 × 10^-3 for VGG-16 on CIFAR-10, 5 × 10^-4 for ResNet-18/20 on CIFAR-10 and VGG-16/ResNet-18/20 on CIFAR-100, and 1 × 10^-4 for VGG-16 on ImageNet. (A hedged PyTorch configuration sketch reflecting these values follows the table.)
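For context on the Pseudocode row: Algorithm 1 in the paper converts a trained ANN into an SNN of integrate-and-fire neurons, and the title's "optimized potential initialization" refers to starting each neuron's membrane potential at a non-zero value (half the firing threshold) instead of zero. The sketch below is a minimal, hypothetical illustration of that idea written against the PyTorch API the paper mentions; the class name, soft-reset rule, and usage are assumptions, not the authors' released code.

    # Minimal, illustrative sketch (not the authors' code): an integrate-and-fire
    # layer whose membrane potential starts at half the firing threshold.
    # All class and argument names here are hypothetical.
    import torch
    import torch.nn as nn

    class IFNeuronLayer(nn.Module):
        def __init__(self, threshold: float):
            super().__init__()
            self.threshold = threshold
            self.v = None  # membrane potential, created lazily per input shape

        def reset(self, x: torch.Tensor):
            # Initialize the membrane potential to threshold / 2 rather than 0,
            # which reduces the ANN-to-SNN conversion error at small time-steps.
            self.v = torch.full_like(x, self.threshold / 2)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            if self.v is None or self.v.shape != x.shape:
                self.reset(x)
            self.v = self.v + x                       # integrate input current
            spike = (self.v >= self.threshold).float()
            self.v = self.v - spike * self.threshold  # soft reset by subtraction
            return spike * self.threshold             # weighted spike output

    # Usage sketch: a constant input current unrolled over 8 time-steps; the
    # averaged output approximates the corresponding clipped ANN activation.
    layer = IFNeuronLayer(threshold=1.0)
    x = torch.rand(4, 16)
    rate = sum(layer(x) for _ in range(8)) / 8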
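The hyperparameters quoted in the Experiment Setup row can be collected into a configuration sketch. This is not the authors' training script: the optimizer (SGD with momentum 0.9) and the cosine learning-rate schedule are assumptions; only the initial learning rates, epoch counts, and weight-decay coefficients come from the quoted text.

    # Hedged training-configuration sketch built from the quoted hyperparameters.
    # Optimizer choice and schedule are assumed; the numeric values are from the
    # paper's text as quoted above.
    import torch

    CONFIGS = {
        "cifar10":  {"lr": 0.1,  "epochs": 300, "weight_decay": 5e-4},
        "cifar100": {"lr": 0.02, "epochs": 300, "weight_decay": 5e-4},
        "imagenet": {"lr": 0.1,  "epochs": 120, "weight_decay": 1e-4},
    }

    def build_optimizer(model: torch.nn.Module, dataset: str):
        cfg = CONFIGS[dataset]
        # Note: the upper-bound parameter θ would need its own parameter group,
        # since the paper assigns it a separate weight decay (1e-3 to 1e-4
        # depending on model and dataset).
        optimizer = torch.optim.SGD(
            model.parameters(),
            lr=cfg["lr"],
            momentum=0.9,                     # assumed, not stated in the quote
            weight_decay=cfg["weight_decay"],
        )
        scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
            optimizer, T_max=cfg["epochs"]    # assumed schedule
        )
        return optimizer, scheduler, cfg["epochs"]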