A Plasticity-Centric Approach to Train the Non-Differential Spiking Neural Networks

Authors: Tielin Zhang, Yi Zeng, Dongcheng Zhao, Mengting Shi

AAAI 2018 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Finally we get the accuracy of 98.52% on the hand-written digits classification task on MNIST.
Researcher Affiliation | Academia | 1 Institute of Automation, Chinese Academy of Sciences, Beijing, China; 2 Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai, China; 3 University of Chinese Academy of Sciences, Beijing, China
Pseudocode | Yes | Algorithm 1 The Algorithm of SNN Learning.
Open Source Code | No | The paper does not provide any explicit statement about releasing its source code or a link to a code repository for the described methodology.
Open Datasets | Yes | The Modified National Institute of Standards and Technology (MNIST) dataset with ten classes of hand-written digits (from zero to nine) is used to test the performance of the proposed SNN algorithm.
Dataset Splits | No | Here we use standard 60,000 MNIST data to train and another 10,000 to test (no cross validation).
Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU models, CPU types, memory) used to run its experiments.
Software Dependencies | No | The paper does not mention any specific software dependencies with version numbers.
Experiment Setup | Yes | We set iteration time as 100, the patch size as 10, and the learning rate as 0.05.
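The "Open Datasets" and "Dataset Splits" rows above refer to the standard MNIST partition: 60,000 training images and 10,000 test images, with no cross-validation. Since the paper releases no code, the snippet below is only a minimal sketch of how that standard split can be loaded; the use of torchvision and all identifiers in it are our assumptions, not choices stated by the authors.

```python
# Hypothetical sketch: loading the standard MNIST split (60,000 train / 10,000 test)
# quoted in the "Dataset Splits" row. torchvision is an assumption; the paper does
# not say which tooling was used.
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()

# train=True yields the 60,000-image training set; train=False the 10,000-image test set.
train_set = datasets.MNIST(root="./data", train=True, download=True, transform=to_tensor)
test_set = datasets.MNIST(root="./data", train=False, download=True, transform=to_tensor)

print(len(train_set), len(test_set))  # 60000 10000 -- no cross-validation folds are built
```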
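The "Experiment Setup" row quotes three hyperparameters: an iteration time of 100, a patch size of 10, and a learning rate of 0.05. The sketch below merely collects those quoted values in a hypothetical configuration object with a placeholder loop; the field names and the `train` function are illustrative, and the paper's actual plasticity-based update (its Algorithm 1) is not reproduced here.

```python
# Hypothetical configuration container mirroring the values quoted in the
# "Experiment Setup" row. Field names are our own labels, not identifiers
# from the paper.
from dataclasses import dataclass


@dataclass
class SNNTrainConfig:
    iterations: int = 100        # "iteration time as 100"
    patch_size: int = 10         # "the patch size as 10"
    learning_rate: float = 0.05  # "the learning rate as 0.05"


def train(config: SNNTrainConfig) -> None:
    for step in range(config.iterations):
        # Placeholder: the paper's plasticity-based SNN update (Algorithm 1)
        # would be applied here; it is not reproduced in this sketch.
        pass


train(SNNTrainConfig())
```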