Large Associative Memory Problem in Neurobiology and Machine Learning
Authors: Dmitry Krotov, John J. Hopfield
ICLR 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper focuses on extending a mathematical model, deriving equations, and clarifying relationships between various models of associative memory. It does not report any empirical studies, data analysis, or experimental results. |
| Researcher Affiliation | Collaboration | Dmitry Krotov, MIT-IBM Watson AI Lab, IBM Research (krotov@ibm.com); John Hopfield, Princeton Neuroscience Institute, Princeton University (hopfield@princeton.edu). |
| Pseudocode | No | The paper describes mathematical models and derivations but does not include any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statement or link regarding the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and does not train models on datasets. It mentions the Kuzushiji-Kanji dataset (Clanuwat et al., 2018) and the immune repertoire classification task considered in Widrich et al. (2020) as examples of problems that could benefit from large associative memory, but it does not use them in its own analysis. |
| Dataset Splits | No | The paper is theoretical and does not involve dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not involve computational experiments that would require specific hardware specifications. |
| Software Dependencies | No | The paper is theoretical and does not specify any software dependencies with version numbers needed to replicate experimental results. |
| Experiment Setup | No | The paper is theoretical and describes mathematical models and derivations rather than an experimental setup with hyperparameters or training configurations. |