Biological Learning of Irreducible Representations of Commuting Transformations

Authors: Alexander Genkin, David Lipshutz, Siavash Golkar, Tiberiu Tesileanu, Dmitri Chklovskii

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Simulations here are intended to demonstrate the following properties of our algorithms."
Researcher Affiliation | Academia | "Alexander Genkin*, David Lipshutz, Siavash Golkar, Tiberiu Tesileanu, Dmitri B. Chklovskii*; *Neuroscience Institute, NYU Langone Medical School; Center for Computational Neuroscience, Flatiron Institute"
Pseudocode | Yes | "Algorithm 1: The SVD algorithm with deflation"
Open Source Code | Yes | "All code for these experiments is included in the Supplementary material."
Open Datasets | Yes | "We used natural images from the Van Hateren database [8] and digits from the MNIST dataset [9]."
Dataset Splits | No | The paper describes data generation and input sizes for simulations and refers to general training details in the checklist, but does not specify explicit train/validation/test splits (e.g., percentages or counts) for the datasets used.
Hardware Specification | Yes | "This experiment took 14 minutes total on a MacBook Pro with a 3.5 GHz Dual-Core Intel Core i7 processor."
Software Dependencies | No | The paper mentions a 'multi-layer perceptron' and a 'bi-linear approximation' but does not specify any software libraries or their version numbers.
Experiment Setup | Yes | "Learning rates were manually selected to be 5 × 10⁻⁴ for both algorithms."
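The pseudocode row above refers to the paper's Algorithm 1, "The SVD algorithm with deflation." As a generic illustration of that idea only (power iteration with rank-1 deflation, not the paper's biologically plausible network; function name and tolerances are our own), a minimal sketch:

```python
import numpy as np

def top_singular_triplets(A, k, iters=500, seed=0):
    """Estimate the top-k singular triplets of A by power iteration
    with deflation: once a component is found, its rank-1 contribution
    is subtracted from A before estimating the next one.
    Illustrative only; not the paper's Algorithm 1."""
    rng = np.random.default_rng(seed)
    A = A.astype(float).copy()
    U, S, V = [], [], []
    for _ in range(k):
        v = rng.standard_normal(A.shape[1])
        for _ in range(iters):
            u = A @ v
            u /= np.linalg.norm(u)      # left singular vector estimate
            v = A.T @ u
            sigma = np.linalg.norm(v)   # singular value estimate
            v /= sigma                  # right singular vector estimate
        U.append(u); S.append(sigma); V.append(v)
        A -= sigma * np.outer(u, v)     # deflation step
    return np.array(U).T, np.array(S), np.array(V).T
```

Deflation makes each successive power iteration converge to the next-largest remaining singular component, which is the same "peel off one component at a time" structure the paper's algorithm exploits.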