Inference of Neural Dynamics Using Switching Recurrent Neural Networks

Authors: Yongxu Zhang, Shreya Saxena

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We apply these models to simulated data as well as cortical neural activity across mice and monkeys, which allows us to automatically detect discrete states that lead to the identification of varying neural dynamics. ... We validate SRNNs on a simulated dataset, and then analyze the performance of SRNNs on three experimental datasets with distinct recording modalities, behavioral tasks, and animals: (1) electrophysiological recordings of single-unit MC activity from a non-human primate performing a reaching task (Churchland et al. [2012]), (2) cortex-wide widefield calcium imaging (WFCI) from mice performing a complex self-initiated decision-making task (Musall et al. [2019]), and (3) WFCI from mice performing a simple self-initiated lever-pull task (Mitelut et al. [2022]).
Researcher Affiliation | Academia | Yongxu Zhang and Shreya Saxena, Yale University, {yongxu.zhang, shreya.saxena}@yale.edu
Pseudocode | No | The paper provides schematics and equations for the model (Figure 1, Equations 1-8), but does not include any explicitly labeled pseudocode blocks or algorithm steps.
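Since the model is specified only by schematics and equations, a minimal sketch of the core switching-RNN idea may help: a discrete state selects which set of recurrent weights drives the continuous latent dynamics. The tanh nonlinearity, array shapes, and function names below are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def srnn_step(h, z, W, b):
    """One continuous-latent update of a switching RNN: the active
    discrete state z selects the recurrent weights and bias.
    h: (P,) latent state; W: (S, P, P) per-state weights; b: (S, P) biases."""
    return np.tanh(W[z] @ h + b[z])

def rollout(h0, z_seq, W, b):
    """Unroll the latent dynamics under a given discrete-state sequence."""
    h, traj = h0, []
    for z in z_seq:
        h = srnn_step(h, z, W, b)
        traj.append(h)
    return np.stack(traj)

# Example: P = 16 latent dimensions (as in the paper's setup), S = 3 states.
rng = np.random.default_rng(0)
P, S, T = 16, 3, 50
W = rng.normal(scale=0.1, size=(S, P, P))
b = rng.normal(scale=0.1, size=(S, P))
traj = rollout(np.zeros(P), rng.integers(0, S, size=T), W, b)
print(traj.shape)  # (50, 16)
```

Each segment of the trajectory thus follows the (nonlinear) dynamics of whichever discrete state is active, which is what lets the model assign distinct dynamics to distinct behavioral epochs.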
Open Source Code | Yes | Our implementation is based on PyTorch 2.2.1 and we train our models using NVIDIA A100 GPUs. We provide the analysis and the results in this paper; the original code for the entire SRNNs framework in PyTorch has been made public: https://github.com/saxenalab-neuro/SRNN
Open Datasets | Yes | We apply these models to simulated data as well as cortical neural activity across mice and monkeys... In a monkey reaching dataset with electrophysiology recordings... a mouse self-initiated lever pull dataset with widefield calcium recordings, and a mouse self-initiated decision making dataset with widefield calcium recording... We validate SRNNs on a simulated dataset, and then analyze the performance of SRNNs on three different experimental datasets... Churchland et al. [2012]... Musall et al. [2019]... Mitelut et al. [2022].
Dataset Splits | Yes | On each dataset, we do N-fold cross-validation, where N equals the number of conditions, sessions, or subjects in the dataset. All the results in this section are reported on the test set. Additionally, we show example curves of the training loss, reconstruction MSE on validation data, and discrete-state recovery error on validation data for SRNNs across different epochs for all three experimental datasets in Figure D.7.
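The N-fold scheme described above, with one fold per condition, session, or subject, amounts to leave-one-group-out splitting. A minimal sketch, with hypothetical function and variable names:

```python
import numpy as np

def leave_one_group_out(group_ids):
    """Yield (held_out_group, train_idx, test_idx) for each unique group,
    so the number of folds N equals the number of groups — matching the
    per-condition/session/subject cross-validation the paper describes."""
    ids = np.asarray(group_ids)
    for g in np.unique(ids):
        yield g, np.flatnonzero(ids != g), np.flatnonzero(ids == g)

# Example: 5 trials spanning 3 conditions -> 3 folds.
folds = list(leave_one_group_out([0, 0, 1, 1, 2]))
print(len(folds))  # 3
```

Holding out a whole condition (rather than random trials) tests whether the learned dynamics generalize to unseen task conditions instead of merely interpolating within seen ones.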
Hardware Specification | Yes | Our implementation is based on PyTorch 2.2.1 and we train our models using NVIDIA A100 GPUs.
Software Dependencies | Yes | Our implementation is based on PyTorch 2.2.1 and we train our models using NVIDIA A100 GPUs.
Experiment Setup | Yes | We use P = 16 for the latent state (h) dimensionality for all three models in Figure 3A. ... We determine the number of discrete latent states via a hyperparameter sweep for SRNNs. ... We minimize the loss function via the Adam optimizer. ... We set the variance σ to be constant to reduce the complexity of the model and only optimize µ (Dong et al. [2020], Ganguly and Earp [2021]); we use 0.0001 for σ. ... Here, we use t0 = 10 timepoints, and we also explore the prediction performance by predicting different lengths of neural activity (K ∈ {10, 20, 30, 40} timepoints).
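With σ held constant, the Gaussian reconstruction term reduces (up to an additive constant) to a scaled mean-squared error, which is why only µ needs to be optimized. A small numerical sketch of that equivalence, with illustrative values rather than the paper's exact loss:

```python
import numpy as np

SIGMA = 1e-4  # fixed observation std, matching the sigma = 0.0001 in the setup

def gaussian_nll(y, mu, sigma=SIGMA):
    """Per-element Gaussian negative log-likelihood with constant sigma.
    The first term is MSE scaled by 1/(2*sigma^2); the rest is constant,
    so gradients w.r.t. mu are identical to (scaled) MSE gradients."""
    return (0.5 * np.mean((y - mu) ** 2) / sigma**2
            + np.log(sigma) + 0.5 * np.log(2.0 * np.pi))

rng = np.random.default_rng(1)
y = rng.normal(size=100)
mu_good, mu_bad = y + 1e-6, y + 1e-3
# A better reconstruction lowers the NLL, exactly as MSE would.
print(gaussian_nll(y, mu_good) < gaussian_nll(y, mu_bad))  # True
```

This also explains the reported hyperparameters' roles: Adam minimizes this reconstruction term (plus the model's other loss terms) over µ alone, while σ stays fixed at 0.0001.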