Adaptation Accelerating Sampling-based Bayesian Inference in Attractor Neural Networks
Authors: Xingsi Dong, Zilong Ji, Tianhao Chu, Tiejun Huang, Wenhao Zhang, Si Wu
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Simulation results validate our theoretical analyses. |
| Researcher Affiliation | Academia | 1. School of Psychology and Cognitive Sciences, IDG/McGovern Institute for Brain Research, PKU-Tsinghua Center for Life Sciences, Academy for Advanced Interdisciplinary Studies, Center of Quantitative Biology, Peking University. 2. Lyda Hill Department of Bioinformatics, O'Donnell Brain Institute, UT Southwestern Medical Center. 3. Institute of Cognitive Neuroscience, University College London. 4. School of Computer Science, Peking University. |
| Pseudocode | No | The paper provides mathematical equations and descriptions of dynamics but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | Yes | We have included the code for reproducing the main results in SI. |
| Open Datasets | No | The paper uses a linear Gaussian generative model as the theoretical framework for its simulations rather than an external, publicly available dataset; observations and latent features are defined internally. |
| Dataset Splits | No | Since the paper does not use an external dataset, it specifies no train/validation/test splits. The analysis instead focuses on the convergence of sampled distributions within its theoretical model. |
| Hardware Specification | No | We do not use GPU or CPU clusters, and it is sufficient to run our code on a laptop. |
| Software Dependencies | No | The paper does not provide specific names or version numbers for ancillary software components, libraries, or solvers used in the experiments. |
| Experiment Setup | Yes | For the setting of hyperparameters, see SI.1. |