Optimal prior-dependent neural population codes under shared input noise

Authors: Agnieszka Grabska-Barwinska, Jonathan W Pillow

NeurIPS 2014

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Here we analyze population coding under a simple alternative model in which latent input noise corrupts the stimulus before it is encoded by the population. This provides a convenient and tractable description for irreducible uncertainty that cannot be overcome by adding neurons, and induces stimulus-dependent correlations that mimic certain aspects of the correlations observed in real populations. We examine prior-dependent, Bayesian optimal coding in such populations using exact analyses of cases in which the posterior is approximately Gaussian. These analyses extend previous results on independent Poisson population codes and yield an analytic expression for squared loss and a tight upper bound for mutual information.
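The encoding model summarized above (latent input noise corrupting the stimulus before conditionally independent Poisson spiking) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the tuning-curve parameters, noise scale, and neuron indices below are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper): N neurons with Gaussian
# tuning curves tiling the stimulus axis, peak rate `gain` spikes/bin.
N, T = 20, 5000
centers = np.linspace(-5.0, 5.0, N)
width, gain = 1.0, 10.0
sigma_in = 0.5                      # std of the shared (latent) input noise

s = 0.0                             # hold the stimulus fixed across trials
eta = rng.normal(0.0, sigma_in, T)  # one shared noise draw per trial
s_eff = s + eta                     # noise corrupts s *before* encoding

# Conditionally independent Poisson spiking given the noisy stimulus.
rates = gain * np.exp(-(s_eff[:, None] - centers[None, :]) ** 2 / (2 * width**2))
spikes = rng.poisson(rates)

# Marginalizing over eta induces correlated variability: two neurons whose
# tuning curves both rise with eta near s = 0 become positively correlated.
i = np.argmin(np.abs(centers - 1.0))
j = np.argmin(np.abs(centers - 1.5))
noise_corr = np.corrcoef(spikes[:, i], spikes[:, j])[0, 1]
```

Even though spiking is conditionally independent given the noisy input, the shared draw of `eta` produces noise correlations at a fixed stimulus, which is the "irreducible uncertainty" the abstract refers to: averaging over more neurons cannot remove variability that entered before encoding.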
Researcher Affiliation | Academia | Agnieszka Grabska-Barwińska, Gatsby Computational Neuroscience Unit, University College London (agnieszka@gatsby.ucl.ac.uk); Jonathan W. Pillow, Princeton Neuroscience Institute and Department of Psychology, Princeton University (pillow@princeton.edu)
Pseudocode | No | The paper contains mathematical derivations and equations but no pseudocode or algorithm blocks.
Open Source Code | No | The paper does not link to open-source code or state that code for the described methodology is available.
Open Datasets | No | The paper is theoretical, working with abstract stimuli (e.g., a zero-mean Gaussian prior); it does not use or provide access to a concrete, publicly available dataset for training or evaluation.
Dataset Splits | No | The paper is theoretical and analytical, so it involves no dataset splits (e.g., training, validation, test) for empirical evaluation.
Hardware Specification | No | The paper is theoretical and does not describe any hardware used for computational experiments.
Software Dependencies | No | The paper focuses on mathematical analysis and derivations; it references the NIST Digital Library of Mathematical Functions but lists no software dependencies with version numbers.
Experiment Setup | No | As a theoretical paper, it describes no experimental setup in terms of hyperparameters or system-level training settings.