Enhancing Representation of Spiking Neural Networks via Similarity-Sensitive Contrastive Learning

Authors: Yuhan Zhang, Xiaode Liu, Yuanpei Chen, Weihang Peng, Yufei Guo, Xuhui Huang, Zhe Ma

AAAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | "Experimental results show that our method consistently outperforms the current state-of-the-art algorithms on both popular non-spiking static and neuromorphic datasets." |
| Researcher Affiliation | Collaboration | Intelligent Science & Technology Academy of CASIC, Beijing, 100144, China. Contacts: yfguo@pku.edu.cn, mazhe_thu@163.com |
| Pseudocode | Yes | "Finally, the process to train an SNN with the proposed methods will be given in detail by pseudocode." (Algorithm 1, "SNN Training with Similarity-Sensitive Contrastive Learning", is presented on the following page.) A baseline contrastive-loss sketch is given below this table. |
| Open Source Code | No | The paper does not provide any statement or link regarding the availability of open-source code for the described methodology. |
| Open Datasets | Yes | "We evaluate our method on both static and neuromorphic datasets... CIFAR-10 (Krizhevsky, Nair, and Hinton 2010), CIFAR-100 (Krizhevsky, Nair, and Hinton 2010), ImageNet (Deng et al. 2009), and CIFAR10-DVS (Li 2017)." |
| Dataset Splits | No | The paper mentions CIFAR-10, CIFAR-100, ImageNet, and CIFAR10-DVS but does not explicitly provide train/validation/test splits in the main text. It states "The details for the datasets and settings are given in the appendix", and no specific percentages or sample counts appear in the body of the paper. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU models, CPU models, or memory specifications) used for running its experiments. |
| Software Dependencies | No | The paper mentions algorithms and methods (e.g., the STBP algorithm) but does not specify software dependencies with version numbers (e.g., Python version, or library versions such as PyTorch, TensorFlow, or CUDA). |
| Experiment Setup | Yes | "The hyperparameters for the LIF neuron, including the firing threshold Uth, the membrane potential decay τ, and the reset potential Ureset, were set to 0.5, 0.25, and 0, respectively. All training settings for ImageNet are the same as for the CIFAR datasets except for a temperature of 0.07 (τ = 0.07) and 320 training epochs." A minimal sketch of these LIF dynamics is given below this table. |
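
The paper's similarity-sensitive weighting is not reproduced in this report, so the sketch below shows only the plain InfoNCE baseline such methods typically build on, using the reported temperature of 0.07. The function name `info_nce` and the pairing convention are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: standard InfoNCE contrastive loss with temperature 0.07.
# This is NOT the paper's similarity-sensitive variant, whose weighting
# scheme is not described in this report.
import torch
import torch.nn.functional as F


def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.07) -> torch.Tensor:
    """Plain InfoNCE over two batches of embeddings, each of shape [N, D].

    z1[i] and z2[i] are treated as a positive pair; all other cross-batch
    pairs serve as negatives.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature                      # [N, N] scaled cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)    # diagonal entries are positives
    return F.cross_entropy(logits, targets)
```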
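
The reported LIF hyperparameters (Uth = 0.5, decay τ = 0.25, Ureset = 0) fit the iterative leaky-integrate-and-fire update commonly used with STBP training. The paper's exact update rule and surrogate gradient are not given in this report, so the following is a minimal sketch under those common assumptions; `SpikeFn`, `lif_forward`, and the rectangular surrogate are illustrative choices, not the paper's.

```python
# Hedged sketch of iterative LIF dynamics with the reported hyperparameters.
# The surrogate gradient and reset convention are assumptions (common
# STBP-style choices), not confirmed details from the paper.
import torch

U_TH = 0.5      # firing threshold, as reported
TAU = 0.25      # membrane potential decay, as reported
U_RESET = 0.0   # reset potential, as reported


class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient (a common choice)."""

    @staticmethod
    def forward(ctx, u):
        ctx.save_for_backward(u)
        return (u >= U_TH).float()

    @staticmethod
    def backward(ctx, grad_out):
        (u,) = ctx.saved_tensors
        # Pass gradients only within a unit-width window around the threshold.
        return grad_out * ((u - U_TH).abs() < 0.5).float()


def lif_forward(x_seq: torch.Tensor) -> torch.Tensor:
    """Run a LIF neuron over a [T, ...] sequence of input currents."""
    u = torch.zeros_like(x_seq[0])
    spikes = []
    for x_t in x_seq:
        u = TAU * u + x_t                  # leaky integration of input current
        s_t = SpikeFn.apply(u)             # emit a spike when u >= U_TH
        u = torch.where(s_t.bool(), torch.full_like(u, U_RESET), u)  # hard reset
        spikes.append(s_t)
    return torch.stack(spikes)             # binary spike train, shape [T, ...]
```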