Direct Training of SNN using Local Zeroth Order Method
Authors: Bhaskar Mukhoty, Velibor Bojkovic, William de Vazelhes, Xiaohan Zhao, Giulia De Masi, Huan Xiong, Bin Gu
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We perform experimental validation of the technique on standard static datasets (CIFAR-10, CIFAR-100, ImageNet-100) and neuromorphic datasets (DVS-CIFAR-10, DVS-Gesture, N-Caltech-101, N-CARS) and obtain results that offer improvement over the state-of-the-art results. |
| Researcher Affiliation | Collaboration | 1 Mohamed bin Zayed University of Artificial Intelligence, UAE 2 ARRC, Technology Innovation Institute, UAE 3 Nanjing University of Information Science and Technology, China 4 School of Artificial Intelligence, Jilin University, China 5 Harbin Institute of Technology, China 6 BioRobotics Institute, Sant'Anna School of Advanced Studies, Pisa, Italy |
| Pseudocode | Yes | Algorithm 1 LOCALZO |
| Open Source Code | Yes | The code is available at https://github.com/BhaskarMukhoty/LocalZO. |
| Open Datasets | Yes | standard static image datasets such as CIFAR-10, CIFAR-100[22], ImageNet-100[12] and neuromorphic datasets such as DVS-CIFAR-10[24], DVS-Gesture[2], N-Caltech-101[30], N-CARS[37]. |
| Dataset Splits | No | The paper provides train and test image counts for datasets like CIFAR-10, CIFAR-100, and ImageNet-100 (e.g., 'each class respectively have (5000, 1000) train and test images' for CIFAR-10), but does not explicitly detail the size or methodology for a validation set split. |
| Hardware Specification | No | The paper does not provide specific hardware details (such as GPU or CPU models, or memory specifications) used to run its experiments. |
| Software Dependencies | No | The paper mentions 'Optimizer: Adam' but does not specify version numbers for any programming languages, libraries, or other software dependencies. |
| Experiment Setup | Yes | Table 7 provides 'Hyper-parameter settings for general comparison' including 'Number epochs', 'Mini batch size', 'Learning Rate', 'Optimizer: Adam with betas: (0.9, 0.999)', and specific parameters for LIF and LOCALZO (δ, m, λ). |
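The table above names the LOCALZO hyper-parameters δ (perturbation scale) and m (number of random samples). As a point of reference for what such a local zeroth-order surrogate gradient might look like in practice, here is a minimal NumPy sketch: it estimates the derivative of the spiking threshold function at a membrane potential `u` by averaging over m Gaussian perturbations. The function name, the Gaussian sampling distribution, and the 1/(2δ) scaling are assumptions for illustration, not details taken from this report.

```python
import numpy as np

def localzo_surrogate(u, v_th=1.0, delta=0.05, m=5, rng=None):
    """Hypothetical zeroth-order surrogate gradient sketch.

    Estimates the derivative of the spike function Heaviside(u - v_th)
    by sampling m perturbations z_i ~ N(0, 1) per neuron and counting
    how often the spike decision would flip within a delta*|z_i| window
    around the threshold. All details here are illustrative assumptions.
    """
    rng = np.random.default_rng() if rng is None else rng
    # One perturbation sample per neuron, m times: shape (m, *u.shape).
    z = rng.standard_normal((m,) + np.shape(u))
    # A sample "flips" the spike decision if u is within delta*|z| of threshold.
    flip = (np.abs(u - v_th) < delta * np.abs(z)).astype(float)
    # Average the |z|-weighted flip indicators; 1/(2*delta) normalizes the window.
    return (np.abs(z) * flip).mean(axis=0) / (2.0 * delta)
```

Note how the estimate is local: neurons near the threshold (u ≈ v_th) receive a large surrogate gradient, while neurons far from it receive (almost surely) zero, so no global model query is needed.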