Expressive probabilistic sampling in recurrent neural networks
Authors: Shirui Chen, Linxing Jiang, Rajesh PN Rao, Eric Shea-Brown
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results |
| Researcher Affiliation | Academia | Shirui Chen, Department of Applied Mathematics, University of Washington, Seattle (sc256@uw.edu); Linxing Preston Jiang, Paul G. Allen School of Computer Science & Engineering, University of Washington, Seattle (prestonj@cs.washington.edu); Rajesh P. N. Rao, Paul G. Allen School of Computer Science & Engineering and Center for Neurotechnology, University of Washington, Seattle (rao@cs.washington.edu); Eric Shea-Brown, Department of Applied Mathematics and Computational Neuroscience Center, University of Washington, Seattle (etsb@uw.edu) |
| Pseudocode | Yes | Algorithm 1: Training RSN |
| Open Source Code | Yes | All code is available on GitHub |
| Open Datasets | Yes | MNIST dataset [32] |
| Dataset Splits | No | The paper mentions using MNIST and CIFAR-10 datasets and training for a certain number of iterations/epochs, but does not explicitly provide the train/validation/test dataset splits with percentages or sample counts. |
| Hardware Specification | Yes | All experiments were run on one NVIDIA Quadro RTX 6000 GPU. |
| Software Dependencies | No | The paper mentions using the Adam optimizer but does not provide specific version numbers for software dependencies or libraries used in the experiments. |
| Experiment Setup | Yes | The learning rate was 0.0001, and the batch size was 128. |
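The reported setup (Adam optimizer, learning rate 0.0001, batch size 128) can be sketched as a standard Adam parameter update. This is an illustrative toy, not the paper's RSN training loop; the beta/epsilon defaults and the quadratic objective are assumptions, since the paper does not state them.

```python
import numpy as np

# Values reported in the paper's setup; BETA1/BETA2/EPS are assumed
# Adam defaults (not stated in the paper).
LEARNING_RATE = 1e-4
BATCH_SIZE = 128
BETA1, BETA2, EPS = 0.9, 0.999, 1e-8

def adam_step(theta, grad, m, v, t):
    """One Adam update (Kingma & Ba) on parameter vector theta."""
    m = BETA1 * m + (1 - BETA1) * grad          # first-moment estimate
    v = BETA2 * v + (1 - BETA2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - BETA1 ** t)                # bias correction
    v_hat = v / (1 - BETA2 ** t)
    theta = theta - LEARNING_RATE * m_hat / (np.sqrt(v_hat) + EPS)
    return theta, m, v

# Toy usage: minimize ||theta||^2 with noisy minibatch-style gradients.
rng = np.random.default_rng(0)
theta = rng.normal(size=5)
m, v = np.zeros_like(theta), np.zeros_like(theta)
start_loss = float(np.sum(theta ** 2))
for t in range(1, 2001):
    grad = 2 * theta + rng.normal(scale=0.01, size=5)  # stand-in gradient
    theta, m, v = adam_step(theta, grad, m, v, t)
final_loss = float(np.sum(theta ** 2))
```

With such a small learning rate, Adam takes near-constant steps of magnitude roughly `LEARNING_RATE` per coordinate, so convergence on the toy objective is slow but steady.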