Structured Recognition for Generative Models with Explaining Away
Authors: Changmin Yu, Hugo Soulat, Neil Burgess, Maneesh Sahani
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We instantiate the framework in nonlinear Gaussian Process Factor Analysis, evaluating the structured recognition framework using synthetic data from known generative processes. We fit the GPFA model to high-dimensional neural spike data from the hippocampus of freely moving rodents, where the model successfully identifies latent signals that correlate with behavioural covariates. |
| Researcher Affiliation | Academia | Changmin Yu1,2 Hugo Soulat2 Neil Burgess1 Maneesh Sahani2 1Institute of Cognitive Neuroscience; 2Gatsby Computational Neuroscience Unit; UCL, London, United Kingdom {changmin.yu.19; hugo.soulat.19; n.burgess}@ucl.ac.uk; maneesh@gatsby.ucl.ac.uk |
| Pseudocode | No | The paper does not contain any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | Yes | Python implementation can be found at https://github.com/gatsby-sahani/structured-recognition-neurips2022 |
| Open Datasets | Yes | We used single-cell spiking data from neurons in the hippocampal CA1 and mEC regions of rats recorded during exploration of a Z-shaped track, as reported by Ólafsdóttir et al. [41]. |
| Dataset Splits | No | The paper mentions training but does not explicitly state the dataset splits for train, validation, and test sets. It mentions 'short batches of data' for training, but gives no specific percentages or counts. |
| Hardware Specification | No | The paper mentions computational constraints and training but does not specify the exact hardware used for the experiments (e.g., specific GPU or CPU models, memory details). |
| Software Dependencies | Yes | PyTorch: An imperative style, high-performance deep learning library. In H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. Fox, and R. Garnett, editors, Advances in Neural Information Processing Systems 32, pages 8024–8035. Curran Associates, Inc., 2019. URL http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf. Diederik P Kingma and Jimmy Ba. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980, 2014. |
| Experiment Setup | Yes | All models were implemented with the same recognition and generative network architectures (see Appendix G for implementation details). For SR-nlGPFA and SGP-VAE, we adopted the GPFA generative model Eq. 3 with DGM non-linearity and Poisson observation likelihood. Computational constraints meant that the SR-nlGPFA model was trained using short batches of data and 64 inducing points per batch. |