Targeting EEG/LFP Synchrony with Neural Nets

Authors: Yitong Li, Michael Murias, Samantha Major, Geraldine Dawson, Kafui Dzirasa, Lawrence Carin, David E. Carlson

NeurIPS 2017

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "The proposed approach is demonstrated to yield competitive (often state-of-the-art) predictive performance during our empirical tests while yielding interpretable features." |
| Researcher Affiliation | Academia | 1. Department of Electrical and Computer Engineering, Duke University; 2. Departments of Psychiatry and Behavioral Sciences, Duke University; 3. Department of Civil and Environmental Engineering, Duke University; 4. Department of Biostatistics and Bioinformatics, Duke University |
| Pseudocode | No | No pseudocode or algorithm blocks are present in the paper. |
| Open Source Code | No | The paper states "These data will be released with publication of the paper." (referring to the LFP dataset) but neither mentions nor links to a source-code release for the described methodology. |
| Open Datasets | Yes | UCI EEG: "This dataset has a total of 122 subjects..." (footnote links to https://kdd.ics.uci.edu/databases/eeg/eeg.html); DEAP: the Database for Emotion Analysis using Physiological signals [14]; SEED: "This dataset [35]". |
| Dataset Splits | Yes | UCI EEG: randomly split 7 : 1 : 2 into training, validation, and test sets; DEAP: the remaining subjects are split into 22 for training and 9 for validation; SEED: the remaining 14 subjects are split into 10 for training and 4 for validation; ASD: 17 patients are used to train the model and 4 patients as a validation set. |
| Hardware Specification | Yes | "The experiments were run on a 6-core i7 machine with a Nvidia Titan X Pascal GPU." |
| Software Dependencies | No | The paper states "The code is written in Python and TensorFlow" but does not specify version numbers for these or any other software dependencies. |
| Experiment Setup | Yes | "dropout is instituted at the channel level... p determines the typical percentage of channels included, and was set as p = 0.75." SyncNet used K = 20 filters with filter length 40. |
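The 7 : 1 : 2 random split reported for the UCI EEG dataset can be sketched as below. The function name, the seed, and the use of NumPy are illustrative assumptions, not details from the paper.

```python
import numpy as np

def split_indices(n_samples, ratios=(0.7, 0.1, 0.2), seed=0):
    """Randomly split sample indices into train/validation/test sets.

    `ratios` mirrors the 7:1:2 split reported for the UCI EEG data;
    the seed and function name are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)          # shuffle before splitting
    n_train = int(ratios[0] * n_samples)
    n_val = int(ratios[1] * n_samples)
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:]              # remainder goes to test
    return train, val, test

train, val, test = split_indices(122)         # UCI EEG has 122 subjects
```

Because the split is over a permutation of all indices, the three sets are disjoint and cover every sample.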
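Channel-level dropout with keep probability p = 0.75, as described in the experiment-setup row, might look like the sketch below. Rescaling kept channels by 1/p (inverted dropout) is a common convention and is an assumption here, not something stated in the excerpt.

```python
import numpy as np

def channel_dropout(x, p_keep=0.75, rng=None):
    """Drop whole EEG channels at random during training.

    x: array of shape (channels, time). Each channel is kept with
    probability `p_keep` (the paper sets p = 0.75). Kept channels are
    rescaled by 1/p_keep so the expected input magnitude is unchanged
    (inverted-dropout convention; an assumption, not from the paper).
    """
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(x.shape[0]) < p_keep    # one Bernoulli draw per channel
    return x * mask[:, None] / p_keep         # zero out dropped channels

x = np.ones((64, 100))                        # 64 channels, 100 time steps
y = channel_dropout(x)
```

Dropping entire channels, rather than individual samples, encourages the model not to rely on any single electrode.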
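The K = 20 filters of length 40 in SyncNet are parametric oscillatory (damped-cosine) filters whose frequency and decay are shared across channels while amplitude and phase vary per channel, which is what lets them capture cross-channel synchrony. The sketch below builds such a bank under that assumption, with illustrative random initialization; in the paper these parameters are learned by backpropagation.

```python
import numpy as np

def syncnet_filter_bank(n_channels, K=20, length=40, rng=None):
    """Build K damped-cosine filters of the SyncNet form.

    Each filter on channel c is b_c * cos(omega * t + phi_c) * exp(-beta * t^2):
    amplitude b and phase phi are per-channel, while frequency omega and
    decay beta are shared across channels within a filter. Initialization
    values here are illustrative assumptions, not the authors' settings.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    t = np.arange(length) - length // 2                        # centered time axis
    b = rng.normal(size=(K, n_channels, 1))                    # per-channel amplitude
    phi = rng.uniform(0, 2 * np.pi, size=(K, n_channels, 1))   # per-channel phase
    omega = rng.uniform(0, np.pi, size=(K, 1, 1))              # shared frequency
    beta = np.full((K, 1, 1), 0.01)                            # shared decay rate
    return b * np.cos(omega * t + phi) * np.exp(-beta * t ** 2)

filters = syncnet_filter_bank(n_channels=64)   # shape (K, channels, length)
```

Convolving a multichannel recording with one such filter and pooling across channels yields a feature sensitive to the relative phases of the channels at that filter's frequency.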