On the mapping between Hopfield networks and Restricted Boltzmann Machines
Authors: Matthew Smart, Anton Zilman
ICLR 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "we conduct experiments on the MNIST dataset which suggest the mapping provides a useful initialization to the RBM weights"; see also Section 3, "EXPERIMENTS ON MNIST DATASET" |
| Researcher Affiliation | Academia | Matthew Smart, Department of Physics, University of Toronto (msmart@physics.utoronto.ca); Anton Zilman, Department of Physics and Institute for Biomedical Engineering, University of Toronto (zilmana@physics.utoronto.ca) |
| Pseudocode | No | The paper does not contain any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | No explicit statement about releasing code or links to a source code repository for the described methodology were found. |
| Open Datasets | Yes | We consider the popular MNIST dataset of handwritten digits (Le Cun et al., 1998) |
| Dataset Splits | No | The paper mentions training and testing images but does not explicitly describe a validation split. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, processor types, or memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper mentions 'scikit-learn (Pedregosa et al., 2011)' but does not provide specific version numbers for it or any other software dependencies. |
| Experiment Setup | Yes | In our experiments we train for 50 epochs with mini-batches of size 100 (3×10⁵ weight updates) (...) The learning rate is η₀ = 10⁻⁴ except the first 25 epochs of the randomly initialized weights in (b), where we used η = 5η₀ due to slow training. (...) Training parameters: β = 2, and CD-20. |
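The setup quoted above (50 epochs, mini-batches of 100, learning rate η₀ = 10⁻⁴, β = 2, CD-20) can be sketched as a minimal NumPy contrastive-divergence training loop. This is an illustrative reconstruction, not the authors' code: the function name, the hidden-layer size, and the reading of β as an inverse temperature scaling the conditional activations are assumptions not stated in the excerpt.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm_cdk(data, n_hidden=64, k=20, epochs=50, batch_size=100,
                  lr=1e-4, beta=2.0, seed=0):
    """Train a Bernoulli RBM with CD-k.

    `beta` scales the effective energy in the conditional probabilities
    (one hypothetical reading of the paper's beta = 2). Defaults mirror
    the quoted setup: CD-20, batches of 100, lr = 1e-4, 50 epochs.
    """
    rng = np.random.default_rng(seed)
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)   # visible biases
    b_h = np.zeros(n_hidden)    # hidden biases

    for _ in range(epochs):
        for start in range(0, len(data), batch_size):
            v0 = data[start:start + batch_size]
            # Positive phase: hidden activations driven by the data.
            ph0 = sigmoid(beta * (v0 @ W + b_h))
            # Negative phase: k steps of alternating Gibbs sampling (CD-k).
            vk = v0
            for _ in range(k):
                hk = (rng.random((len(v0), n_hidden))
                      < sigmoid(beta * (vk @ W + b_h))).astype(float)
                vk = (rng.random(v0.shape)
                      < sigmoid(beta * (hk @ W.T + b_v))).astype(float)
            phk = sigmoid(beta * (vk @ W + b_h))
            # Gradient approximation: data statistics minus model statistics.
            W += lr * (v0.T @ ph0 - vk.T @ phk) / len(v0)
            b_v += lr * (v0 - vk).mean(axis=0)
            b_h += lr * (ph0 - phk).mean(axis=0)
    return W, b_v, b_h
```

With the paper's numbers (60,000 MNIST training images, batch size 100, 50 epochs), the inner loop performs 600 weight updates per epoch.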