Enhancing Training of Spiking Neural Network with Stochastic Latency
Authors: Srinivas Anumasa, Bhaskar Mukhoty, Velibor Bojkovic, Giulia De Masi, Huan Xiong, Bin Gu
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We provide heuristics for our approach with partial theoretical justification and experimental evidence showing the state-of-the-art performance of our models on datasets such as CIFAR-10, DVS-CIFAR10, CIFAR-100, and DVS-Gesture. Our code is available at https://github.com/srinuvaasu/SLT |
| Researcher Affiliation | Academia | 1 Mohamed bin Zayed University of Artificial Intelligence, UAE 2 ARRC, Technology Innovation Institute, UAE 3 BioRobotics Institute, Sant'Anna School of Advanced Studies, Pisa, Italy 4 Harbin Institute of Technology, China 5 School of Artificial Intelligence, Jilin University, China |
| Pseudocode | Yes | Algorithm 1: SLT: Stochastic Latency Training |
| Open Source Code | Yes | Our code is available at https://github.com/srinuvaasu/SLT |
| Open Datasets | Yes | Our models on datasets such as CIFAR-10, DVS-CIFAR10, CIFAR-100, and DVS-Gesture. ... CIFAR-10 (Krizhevsky, Hinton et al. 2009) ... DVS-CIFAR-10 (Li et al. 2017) ... DVS-Gesture (Amir et al. 2017) |
| Dataset Splits | No | The paper mentions training and test sets (e.g., '5000 train images and 1000 test images' for CIFAR-10), but does not explicitly provide details for a validation split. |
| Hardware Specification | No | The paper mentions the use of 'GPUs' for training but does not provide specific hardware models, processors, or detailed specifications. |
| Software Dependencies | No | The paper specifies optimizer details like 'Adam' and 'Cosine Ann.' but does not list specific versions of software libraries or programming languages required for replication (e.g., PyTorch version, Python version). |
| Experiment Setup | Yes | Table 1 ('Hyper-parameter settings for comparison') lists specific values for 'No. of epochs', 'Mini batch size', 'LIF: β', 'LIF: u0', 'LIF: uth', 'λTET', 'Optimiser' (Adam), 'Learning Rate', 'Adam: Betas', and 'Rate Scheduler' (Cosine Ann.). |
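The rows above name the LIF neuron parameters (β, u0, uth) and an 'Algorithm 1: SLT' for stochastic latency training. The following is a minimal, hypothetical sketch of those two ingredients, not the authors' implementation: the LIF dynamics (leaky integration, threshold, hard reset) follow the standard formulation, and the uniform latency-sampling scheme is an assumption inferred from the paper's title; the actual scheme is defined by Algorithm 1 in the paper.

```python
import numpy as np

def lif_forward(inputs, beta=0.5, u0=0.0, uth=1.0):
    """Simulate a layer of leaky integrate-and-fire (LIF) neurons.

    inputs: array of shape (T, n) -- input current at each timestep.
    Returns a binary spike train of shape (T, n). Parameter names
    (beta, u0, uth) mirror Table 1 of the paper; values here are
    placeholders, not the paper's settings.
    """
    T, n = inputs.shape
    u = np.full(n, u0, dtype=float)      # membrane potential
    spikes = np.zeros((T, n))
    for t in range(T):
        u = beta * u + inputs[t]         # leaky integration
        fired = u >= uth                 # threshold crossing
        spikes[t] = fired
        u = np.where(fired, u0, u)       # hard reset on spike
    return spikes

def sample_latency(t_max, rng):
    """Assumed stochastic-latency step: draw the number of simulation
    timesteps for this minibatch uniformly from {1, ..., t_max}."""
    return int(rng.integers(1, t_max + 1))

rng = np.random.default_rng(0)
T = sample_latency(t_max=6, rng=rng)     # latency re-sampled per minibatch
x = rng.random((T, 4))                   # dummy input currents
out = lif_forward(x)                     # spike rates out.mean(0) would feed the loss
```

The sketch only illustrates why hyper-parameters such as β and uth (and a maximum latency) must be reported for replication; any training loop, surrogate gradient, or TET loss term (the 'λTET' above) is omitted.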