Online Neural Connectivity Estimation with Noisy Group Testing
Authors: Anne Draelos, John Pearson
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We tested the performance of Algorithm 1 in both the offline (all data) and online (one test at a time) settings. |
| Researcher Affiliation | Academia | Anne Draelos, Biostatistics & Bioinformatics, Duke University, anne.draelos@duke.edu; John M. Pearson, Biostatistics & Bioinformatics, Electrical & Computer Engineering, Neurobiology, Duke University, john.pearson@duke.edu |
| Pseudocode | Yes | Algorithm 1 Dual decomposition inference |
| Open Source Code | Yes | Code can be found at github.com/pearsonlab/BinaryStim |
| Open Datasets | No | We used randomly generated binary graphs wij in which each link appeared independently with probability K/N. |
| Dataset Splits | No | The paper discusses 'offline (all data)' and 'online (one test at a time)' settings but does not specify explicit train/validation/test dataset splits. |
| Hardware Specification | No | The paper mentions 'efficient GPU implementations' and 'our GPU implementation using CuPy [35]' but does not provide specific hardware details such as GPU model numbers or CPU specifications. |
| Software Dependencies | Yes | our GPU implementation using CuPy [35] performed each gradient descent iteration in under 2 seconds. Adam [34] with step size 0.01, β1 = 0.9, and β2 = 0.999 for optimization in the offline setting. |
| Experiment Setup | Yes | Unless otherwise stated, we use a base case of N = 1000, K = N^0.3 ≈ 8 incoming connections per neuron, S = 10 stimulated neurons per test, α = β = 0.05, µ = 0, σ = 0.1, and Adam [34] with step size 0.01, β1 = 0.9, and β2 = 0.999 for optimization in the offline setting, with convergence typically achieved within 50 steps. |
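The Open Datasets and Experiment Setup rows together describe how the paper's synthetic data are built: a random binary connectivity graph w_ij in which each link appears independently with probability K/N, with base-case parameters N = 1000 and K = N^0.3 ≈ 8. A minimal NumPy sketch of that generation step (variable names and the fixed seed are our own; the paper's actual implementation, reportedly in CuPy, is at github.com/pearsonlab/BinaryStim):

```python
import numpy as np

# Base-case parameters from the Experiment Setup row.
N = 1000            # number of neurons
K = N ** 0.3        # ~8 expected incoming connections per neuron
p_link = K / N      # independent link probability

# Random binary graph: w[i, j] = 1 with probability K/N (seed is arbitrary).
rng = np.random.default_rng(0)
w = (rng.random((N, N)) < p_link).astype(np.int8)

# Adam hyperparameters quoted for the offline setting (names hypothetical).
adam_cfg = {"step_size": 0.01, "beta1": 0.9, "beta2": 0.999}

# Empirical link density should be close to K/N ≈ 0.0079.
print(w.shape, round(w.mean(), 4))
```

This only reproduces the data-generation and optimizer configuration described in the table, not Algorithm 1 (dual decomposition inference) itself.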