Inference by Reparameterization in Neural Population Codes

Authors: Rajkumar Vasudeva Raju, Xaq Pitkow

NeurIPS 2016 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable Result LLM Response
Research Type Experimental Simulations with Gaussian graphical models demonstrate that the neural network's inference quality is comparable to direct evaluation of LBP and robust to noise, and thus provides a promising mechanism for general probabilistic inference in the population codes of the brain. We evaluate the performance of our neural network on a set of small Gaussian graphical models with up to 400 interacting variables. The network's time constants were set to have a ratio of τslow/τfast = 20. Figure 4A shows the neural population dynamics as the network performs inference, along with the temporal evolution of the corresponding node and pairwise means and covariances. The neural activity exhibits a complicated time course, reflecting a combination of many natural parameters changing simultaneously during inference; this type of behavior is seen in neural activity recorded from behaving animals [23, 24, 25]. Figure 4B shows how the performance of the network improves with the ratio of time scales, γ = τslow/τfast. The performance is quantified by the mean squared error in the inferred parameters for a given γ, divided by the error for a reference γ0 = 10. Figure 5 shows that our recurrent neural network accurately infers the marginal probabilities, reaching almost the same conclusions as loopy belief propagation; the data points are obtained from multiple simulations with different graph topologies, including graphs with many loops. Figure 6 verifies that the network is robust to noise even when there are few neurons per inferred parameter; adding more neurons improves performance, since the noise can be averaged away.
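The loopy belief propagation (LBP) baseline that the network is compared against can be sketched directly for Gaussian graphical models. The following is a minimal illustrative implementation, not the paper's neural-network method: it assumes the standard information-form parameterization p(x) ∝ exp(-½ xᵀJx + hᵀx) with precision matrix `J` and bias `h`, and the function name `gaussian_lbp` is ours, not the authors'.

```python
import numpy as np

def gaussian_lbp(J, h, n_iters=100):
    """Loopy belief propagation for a Gaussian graphical model
    p(x) ∝ exp(-x^T J x / 2 + h^T x).
    Returns approximate marginal means and variances."""
    n = J.shape[0]
    nbrs = [[k for k in range(n) if k != i and J[i, k] != 0] for i in range(n)]
    # Message i -> j is a 1-D Gaussian in information form:
    # alpha[i, j] is its precision contribution, beta[i, j] its linear term.
    alpha = np.zeros((n, n))
    beta = np.zeros((n, n))
    for _ in range(n_iters):
        new_alpha = np.zeros((n, n))
        new_beta = np.zeros((n, n))
        for i in range(n):
            for j in nbrs[i]:
                # Aggregate incoming messages at i from all neighbours except j.
                p = J[i, i] + sum(alpha[k, i] for k in nbrs[i] if k != j)
                m = h[i] + sum(beta[k, i] for k in nbrs[i] if k != j)
                new_alpha[i, j] = -J[i, j] ** 2 / p
                new_beta[i, j] = -J[i, j] * m / p
        alpha, beta = new_alpha, new_beta
    # Marginal beliefs: combine the local potential with all incoming messages.
    prec = np.array([J[i, i] + sum(alpha[k, i] for k in nbrs[i]) for i in range(n)])
    mean = np.array([(h[i] + sum(beta[k, i] for k in nbrs[i])) / prec[i]
                     for i in range(n)])
    return mean, 1.0 / prec
```

On walk-summable models (which include the small loopy graphs used here), Gaussian LBP converges and its marginal means match the exact solution J⁻¹h, while the variances are approximate; this is the fixed point that the slow/fast network dynamics are evaluated against.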
Researcher Affiliation Academia Rajkumar V. Raju, Department of ECE, Rice University, Houston, TX 77005, rv12@rice.edu; Xaq Pitkow, Dept. of Neuroscience, Dept. of ECE, Baylor College of Medicine / Rice University, Houston, TX 77005, xaq@rice.edu
Pseudocode No The paper describes algorithms mathematically and textually but does not contain structured pseudocode or algorithm blocks.
Open Source Code No The paper does not provide any statement or link regarding the release of open-source code for the described methodology.
Open Datasets No The paper mentions evaluating performance on 'small Gaussian graphical models with up to 400 interacting variables' but does not provide any specific access information (link, DOI, or formal citation with author/year) for a publicly available dataset.
Dataset Splits No The paper does not provide specific dataset split information (percentages, sample counts, or explicit cross-validation setup) for training, validation, or testing.
Hardware Specification No The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments or simulations.
Software Dependencies No The paper does not provide specific ancillary software details with version numbers (e.g., library or solver names with versions) needed to replicate the experiment.
Experiment Setup Yes The network's time constants were set to have a ratio of τslow/τfast = 20. Noise robustness was tested with independent spatiotemporal Gaussian noise of standard deviation 0.1 times the standard deviation of each signal.
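The noise model in the setup above (independent Gaussian noise with standard deviation scaled to 0.1 of each signal's own standard deviation) can be sketched as follows; the helper name `add_noise` and the (signals × timepoints) layout are our assumptions for illustration, not details from the paper.

```python
import numpy as np

def add_noise(signals, frac=0.1, rng=None):
    """Add independent Gaussian noise whose std is frac times the
    std of each signal. signals: array of shape (n_signals, n_timepoints)."""
    rng = np.random.default_rng() if rng is None else rng
    # Per-signal noise scale: 0.1 * std of that signal's time course.
    sigma = frac * signals.std(axis=1, keepdims=True)
    return signals + rng.normal(0.0, 1.0, signals.shape) * sigma
```

Scaling the noise to each signal's variability keeps the perturbation comparable across natural parameters that evolve on very different amplitudes.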