Parallel Backpropagation for Shared-Feature Visualization

Authors: Alexander Lappe, Anna Bognár, Ghazaleh Ghamkhari Nejad, Albert Mukovskiy, Lucas Martini, Martin Giese, Rufin Vogels

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We apply the algorithm to novel recordings from body-selective regions in macaque IT cortex in order to understand why some images of objects excite these neurons."; "We show results for a novel set of multi-unit recordings from body-selective regions in macaque superior temporal sulcus."; "4 Experimental setup"
Researcher Affiliation | Academia | "Alexander Lappe1,2, Anna Bognár3, Ghazaleh Ghamkhari Nejad3, Albert Mukovskiy1, Lucas Martini1,2, Martin A. Giese1, Rufin Vogels3 — 1Hertie Institute, University Clinics Tübingen; 2IMPRS-IS; 3KU Leuven"
Pseudocode | No | The paper describes the procedure in text and with a diagram (Figure 2), but does not contain a formal pseudocode or algorithm block.
Open Source Code | Yes | "Code and data necessary to reproduce all results given in the paper are included in the submission."
Open Datasets | Yes | "The second set comprised 6,857 objects from varying categories shown on the same gray background and was combined from the Open Images dataset [33], as well as several smaller ones [34, 35]."
Dataset Splits | Yes | "We split the monkey image set into a training/validation/test split consisting of 400/50/25 images."
Hardware Specification | Yes | "All experiments were run on a single Nvidia RTX 2080Ti."
Software Dependencies | No | "Models and training runs, as well as the visualization procedure, were implemented in PyTorch [38]." The paper mentions PyTorch but does not provide a specific version number.
Experiment Setup | Yes | "We set the learning rate to 10^-4 and the weight of the regularization to 0.1." "After training for 2500 epochs, we selected the model with lowest loss on the validation set."
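The 400/50/25 partition quoted under Dataset Splits can be reproduced with a deterministic index split. This is only an illustrative sketch: the seed, shuffling, and index-based partitioning are assumptions, not details given in the paper.

```python
import random

def split_indices(n_images=475, n_train=400, n_val=50, n_test=25, seed=0):
    """Partition image indices into disjoint train/val/test sets."""
    assert n_train + n_val + n_test == n_images
    rng = random.Random(seed)  # fixed seed for reproducibility (assumption)
    indices = list(range(n_images))
    rng.shuffle(indices)
    train = indices[:n_train]
    val = indices[n_train:n_train + n_val]
    test = indices[n_train + n_val:]
    return train, val, test

train, val, test = split_indices()
```

Fixing the seed makes the split reproducible across runs, which is the property the reproducibility check is probing for.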
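The Experiment Setup row describes a fixed training budget (2500 epochs) with the checkpoint chosen by lowest validation loss. The selection logic can be sketched as below; the toy model, gradient step, and learning rate in the usage example are illustrative assumptions (the paper trains a PyTorch model), but the keep-the-best-checkpoint rule matches the quoted description.

```python
import copy

def train_with_selection(train_step, val_loss_fn, model, epochs=2500):
    """Train for a fixed number of epochs and return the checkpoint
    with the lowest validation loss seen during training."""
    best_loss, best_model = float("inf"), copy.deepcopy(model)
    for _ in range(epochs):
        train_step(model)           # one optimization pass over the training set
        loss = val_loss_fn(model)   # evaluate on the held-out validation set
        if loss < best_loss:
            best_loss, best_model = loss, copy.deepcopy(model)
    return best_model, best_loss

# Toy usage: minimize (w - 1)^2 with plain gradient steps (lr is illustrative).
model = {"w": 4.0}
step = lambda m: m.update(w=m["w"] - 0.1 * 2 * (m["w"] - 1.0))
vloss = lambda m: (m["w"] - 1.0) ** 2
best, best_loss = train_with_selection(step, vloss, model, epochs=50)
```

Snapshotting the model (here via `copy.deepcopy`) rather than only recording the loss is what makes the selected checkpoint recoverable after training ends.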