How connectivity structure shapes rich and lazy learning in neural circuits
Authors: Yuhan Helena Liu, Aristide Baratin, Jonathan Cornford, Stefan Mihalas, Eric Todd Shea-Brown, Guillaume Lajoie
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through both empirical and theoretical analyses, we discover that high-rank initializations typically yield smaller network changes indicative of lazier learning, a finding we also confirm with experimentally-driven initial connectivity in recurrent neural networks. |
| Researcher Affiliation | Collaboration | 1University of Washington, Seattle, WA, USA 2Allen Institute for Brain Science, Seattle, WA, USA 3Mila Quebec AI Institute, Montreal, QC, Canada 4Samsung SAIT AI Lab, Montreal, QC, Canada 5McGill University, Montreal, QC, Canada 6Canada CIFAR AI Chair, CIFAR, Toronto, ON, Canada 7Université de Montréal, Montreal, QC, Canada |
| Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code or links to a code repository. |
| Open Datasets | Yes | For our investigations, we applied this initialization scheme across a variety of cognitive tasks including two-alternative forced choice (2AF), delayed-match-to-sample (DMS), context-dependent decision-making (CXT) tasks implemented with Neurogym (Molano-Mazon et al., 2022) and the well-known machine learning benchmark sequential MNIST (sMNIST). |
| Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, or detailed splitting methodology) for reproduction. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions general tools like PyTorch in its references, but it does not specify any software dependencies with version numbers used for its experimental setup. |
| Experiment Setup | No | The paper states 'Details of parameter settings can be found in Appendix C.' and refers to Appendix C for other setup specifics, but these details are not provided in the main text. |