Learning a neural response metric for retinal prosthesis
Authors: Nishal P Shah, Sasidhar Madugula, EJ Chichilnisky, Yoram Singer, Jonathon Shlens
ICLR 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Using data from electrical stimulation experiments, we demonstrate that the learned metric could produce improvements in the performance of a retinal prosthesis. From Section 3.1 (Experimental Setup): Spiking responses from hundreds of retinal ganglion cells (RGCs) in primate retina were recorded using a 512 electrode array system (Litke et al., 2004; Frechette et al., 2005). |
| Researcher Affiliation | Collaboration | Stanford University, Google Brain, Princeton University |
| Pseudocode | No | The paper describes the network architecture in Figure 6 and Table 1 but does not include structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide concrete access to source code (no specific repository link, explicit code release statement, or code in supplementary materials). |
| Open Datasets | No | Spiking responses from hundreds of retinal ganglion cells (RGCs) in primate retina were recorded using a 512 electrode array system (Litke et al., 2004; Frechette et al., 2005). The recordings are described, but the paper provides no public access to the dataset. |
| Dataset Splits | No | The responses were partitioned into training (first 8 seconds) and testing (last 2 seconds) of each trial (see the split sketch below the table). |
| Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, processor types, or memory) used for running its experiments. |
| Software Dependencies | No | The paper specifies the Adam optimizer (Kingma & Ba, 2014) and Xavier initialization (Glorot & Bengio, 2010), but does not list software dependencies with version numbers (e.g., Python or deep learning library versions) needed for reproducibility. |
| Experiment Setup | Yes | Optimizer: Adam (Kingma & Ba, 2014) with α = 0.01, β1 = 0.9, β2 = 0.999; parameter updates: 20,000; batch size: 100; weight initialization: Xavier (Glorot & Bengio, 2010) (see the training-loop sketch below). |
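
The Dataset Splits row quotes a per-trial partition: the first 8 seconds of each trial for training and the last 2 seconds for testing. Below is a minimal sketch of that partition, assuming 10-second trials of binned spike counts; the bin size, array shapes, and function name are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of the train/test partition quoted in the Dataset Splits row:
# the first 8 s of each trial are used for training and the last 2 s for testing.
# The bin size, array shapes, and variable names are assumptions for illustration.
import numpy as np

def split_trial(binned_responses: np.ndarray, bin_size_s: float = 0.01,
                train_duration_s: float = 8.0):
    """Split one trial's binned spike counts (shape: [n_bins, n_cells])
    into a training segment (first 8 s) and a test segment (remaining bins)."""
    n_train_bins = int(round(train_duration_s / bin_size_s))
    return binned_responses[:n_train_bins], binned_responses[n_train_bins:]

# Example: a 10 s trial binned at 10 ms with 100 recorded cells.
trial = np.random.poisson(lam=0.5, size=(1000, 100))
train_seg, test_seg = split_trial(trial)
print(train_seg.shape, test_seg.shape)  # (800, 100) (200, 100)
```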
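
The Experiment Setup row lists the reported optimization hyperparameters. The sketch below wires those numbers into a generic training loop, written in PyTorch for illustration only: the model architecture, data, and loss are placeholders and do not reproduce the paper's embedding network (Figure 6 / Table 1) or its metric-learning objective.

```python
# Hypothetical sketch of the training configuration quoted above.
# Only the Adam settings, number of parameter updates, batch size, and Xavier
# initialization are taken from the paper; everything else is a stand-in.
import torch
import torch.nn as nn

def build_embedding_net(n_cells: int, embed_dim: int = 10) -> nn.Module:
    """Placeholder response-embedding network (architecture is assumed, not the paper's Table 1)."""
    net = nn.Sequential(
        nn.Linear(n_cells, 128),
        nn.ReLU(),
        nn.Linear(128, embed_dim),
    )
    # Xavier ("Glorot") initialization, as listed in the setup table.
    for layer in net:
        if isinstance(layer, nn.Linear):
            nn.init.xavier_uniform_(layer.weight)
            nn.init.zeros_(layer.bias)
    return net

model = build_embedding_net(n_cells=512)

# Adam with the hyperparameters quoted from the paper's setup table.
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, betas=(0.9, 0.999))

NUM_UPDATES = 20_000   # "Parameter updates 20,000"
BATCH_SIZE = 100       # "Batch size 100"

for step in range(NUM_UPDATES):
    batch = torch.randn(BATCH_SIZE, 512)   # stand-in for a batch of RGC responses
    target = torch.randn(BATCH_SIZE, 10)   # stand-in target; the paper uses a metric-learning objective
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(batch), target)  # placeholder loss, not the paper's
    loss.backward()
    optimizer.step()
```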