Bayesian latent structure discovery from multi-neuron recordings

Authors: Scott Linderman, Ryan P. Adams, Jonathan W. Pillow

NeurIPS 2016

Reproducibility checklist (variable: result — supporting LLM response):
Research Type: Experimental — "We demonstrate the effectiveness of our method with applications to synthetic data and multi-neuron recordings in primate retina, revealing latent patterns of neural types and locations from spike trains alone." and "We illustrate the robustness and scalability of our algorithm with synthetic data examples, and we demonstrate the scientific potential of our approach with an application to retinal ganglion cell recordings, where we recover the true underlying cell types and locations from spike trains alone, without reference to the stimulus."
Researcher Affiliation: Collaboration — Scott W. Linderman (Columbia University, swl2133@columbia.edu); Ryan P. Adams (Harvard University and Twitter, rpa@seas.harvard.edu); Jonathan W. Pillow (Princeton University, pillow@princeton.edu)
Pseudocode: No — The paper describes its inference algorithm (MCMC with collapsed Gibbs updates) in text, but it does not include structured pseudocode or an algorithm block.
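Since the paper gives no algorithm block, a minimal sketch may help convey what collapsed Gibbs updates look like. This is NOT the paper's sampler (whose likelihood is a network GLM); it illustrates the same collapsed-Gibbs pattern on a stand-in model, a finite Bernoulli mixture over binary feature vectors, with mixture weights and per-type parameters integrated out under conjugate Dirichlet and Beta priors. All names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def collapsed_gibbs_step(X, z, K, alpha=1.0, a=1.0, b=1.0):
    """One sweep of collapsed Gibbs over discrete type assignments z.

    X : (N, D) binary data; z : (N,) integer labels in [0, K).
    Mixture weights and Bernoulli parameters are integrated out under
    Dirichlet(alpha) and Beta(a, b) priors, so only z is sampled.
    """
    N, D = X.shape
    counts = np.bincount(z, minlength=K).astype(float)  # n_k: size of each type
    sums = np.zeros((K, D))                             # s_kd: per-type feature sums
    for k in range(K):
        sums[k] = X[z == k].sum(axis=0)

    for i in range(N):
        # Remove item i from its current type before resampling.
        counts[z[i]] -= 1
        sums[z[i]] -= X[i]
        # log p(z_i = k | z_-i, X) up to a constant:
        # CRP-like prior term times the Beta-Bernoulli posterior predictive.
        logp = np.log(counts + alpha)
        p1 = (a + sums) / (a + b + counts[:, None])     # predictive P(x_d = 1 | type k)
        logp += (X[i] * np.log(p1) + (1 - X[i]) * np.log1p(-p1)).sum(axis=1)
        probs = np.exp(logp - logp.max())
        probs /= probs.sum()
        z[i] = rng.choice(K, p=probs)
        counts[z[i]] += 1
        sums[z[i]] += X[i]
    return z

# Demo on two well-separated Bernoulli clusters.
X = np.vstack([rng.random((20, 8)) < 0.9,
               rng.random((20, 8)) < 0.1]).astype(int)
z = rng.integers(0, 2, size=40)
for _ in range(20):
    z = collapsed_gibbs_step(X, z, K=2)
```

The key collapsed-Gibbs move — decrement sufficient statistics, score each type by prior times posterior predictive, resample, re-increment — is the structure the paper's text describes, with the GLM network likelihood taking the place of the Beta-Bernoulli predictive here.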
Open Source Code: Yes — "A Python implementation of our inference algorithm is available at https://github.com/slinderman/pyglm."
Open Datasets: Yes — "Finally, we demonstrate the efficacy of this approach with an application to spike trains simultaneously recorded from a population of 27 retinal ganglion cells (RGCs), which have previously been studied by Pillow et al. [13]." and "We thank E. J. Chichilnisky, A. M. Litke, A. Sher and J. Shlens for retinal data."
Dataset Splits: No — The paper mentions 'held-out neurons' for predictive log likelihood and 'synthetic data examples', but it does not specify percentages, sample counts, or a methodology for splitting the data into training, validation, and test sets, nor does it state splits for the synthetic data.
Hardware Specification: Yes — "The following experiments were run on a quad-core Intel i5 with 6GB of RAM."
Software Dependencies: No — The paper points to a Python implementation ("A Python implementation of our inference algorithm is available at https://github.com/slinderman/pyglm.") but does not specify the Python version or any other software dependencies with version numbers.
Experiment Setup: Yes — "We simulate a one minute recording (1ms time bins) from a population of 200 neurons with discrete latent types... The spikes are generated from a Bernoulli observation model." and "we consider scalar weights (K = 1) and use an exponential basis function, φ₁[t] = e^(−t/τ), with time constant of τ = 15ms." and "Since the data are binned at 1ms resolution, we have at most one spike per bin and we use a Bernoulli observation model."
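The setup quoted above can be sketched as a small simulation. This is a hedged stand-in, not the authors' generative code (which lives in the pyglm repository): it simulates Bernoulli spike trains at 1 ms resolution with scalar (K = 1) interaction weights, where the exponential basis φ₁[t] = e^(−t/τ) with τ = 15 ms is implemented as a per-bin decay recursion on spike history. The weight scale, bias, and demo population size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

TAU = 15.0  # basis time constant in ms (1 ms bins), as stated in the paper

def simulate_bernoulli_glm(weights, bias, T, rng):
    """Simulate binary spike trains from a Bernoulli GLM.

    weights : (N, N) scalar interaction weights (K = 1 basis)
    bias    : (N,) baseline log-odds
    Returns a (T, N) array with at most one spike per 1 ms bin.
    """
    N = weights.shape[0]
    spikes = np.zeros((T, N), dtype=np.int8)
    filtered = np.zeros(N)            # spike history filtered by the exponential basis
    decay = np.exp(-1.0 / TAU)        # one-bin decay of phi_1[t] = exp(-t / tau)
    for t in range(T):
        psi = bias + weights @ filtered      # linear activation per neuron
        p = 1.0 / (1.0 + np.exp(-psi))       # Bernoulli spike probability
        spikes[t] = rng.random(N) < p
        filtered = decay * filtered + spikes[t]  # causal update for the next bin
    return spikes

# Tiny demo: 3 neurons and 2000 bins instead of the paper's 200 neurons
# over a one-minute (60,000-bin) recording, to keep the sketch fast.
W = rng.normal(0.0, 0.1, size=(3, 3))   # illustrative weight scale
b = np.full(3, -4.0)                    # illustrative sparse baseline firing
S = simulate_bernoulli_glm(W, b, 2000, rng)
```

The decay recursion is equivalent to convolving the spike history with the truncated exponential basis, which is why no explicit convolution is needed.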