Modeling Dynamic Functional Connectivity with Latent Factor Gaussian Processes

Authors: Lingge Li, Dustin Pluta, Babak Shahbaba, Norbert Fortin, Hernando Ombao, Pierre Baldi

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We present a latent factor Gaussian process model which addresses these challenges by learning a parsimonious representation of connectivity dynamics. As an illustration of the scientific utility of the model, application to a data set of rat local field potential activity recorded during a complex non-spatial memory task provides evidence of stimuli differentiation.
Researcher Affiliation | Academia | Lingge Li (UC Irvine, linggel@uci.edu), Dustin Pluta (UC Irvine, dpluta@uci.edu), Babak Shahbaba (UC Irvine, babaks@uci.edu), Norbert Fortin (UC Irvine, norbert.fortin@uci.edu), Hernando Ombao (KAUST, hernando.ombao@kaust.edu.sa), Pierre Baldi (UC Irvine, pfbaldi@ics.uci.edu)
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide any statement about releasing source code or a link to a code repository for the described methodology.
Open Datasets | Yes | We here propose a novel latent factor Gaussian process (LFGP) model for DFC estimation and apply it to a data set of rat hippocampus LFP during a non-spatial memory task [7]. [7] is a citation for the dataset: Timothy A. Allen, Daniel M. Salz, Sam McKenzie, and Norbert J. Fortin. Nonspatial sequence coding in CA1 neurons. Journal of Neuroscience, 36(5):1547–1563, 2016.
Dataset Splits | No | The paper does not provide specific details on train/validation/test splits with percentages, sample counts, or predefined split citations.
Hardware Specification | Yes | All simulations are run on a 2.7 GHz Intel Core i5 MacBook Pro laptop with 8GB memory.
Software Dependencies | Yes | For sampling from the loading posterior distribution, we use the No-U-Turn Sampler [33] as implemented in PyStan [34]. [34] refers to: Bob Carpenter, Andrew Gelman, Matthew D. Hoffman, Daniel Lee, Ben Goodrich, Michael Betancourt, Marcus Brubaker, Jiqiang Guo, Peter Li, and Allen Riddell. Stan: A probabilistic programming language. Journal of Statistical Software, 76(1), 2017.
Experiment Setup | Yes | For the SW-PCA model, the sliding window size is 50 and the number of principal components is 4. For the HMM, the number of hidden states is increased gradually until the model does not converge, following the implementation outlined in [37]. The prior for GP length scale is a Gamma distribution concentrated around 100ms on the time scale to encourage learning frequency dynamics close to the theta range (4–12 Hz). For the loadings and variances, we use the Gaussian-Inverse Gamma conjugate priors. 20,000 MCMC draws are taken, with the first 5,000 draws discarded as burn-in.
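To make the generative model behind the experiment setup concrete, the following is a minimal NumPy sketch of a latent factor Gaussian process: latent factors are drawn from a GP with a squared-exponential kernel whose length scale is on the order of 100 ms (matching the Gamma prior described above), and observed features are a linear combination of the factors through a loading matrix plus Gaussian noise. All dimensions, parameter values, and variable names here are illustrative assumptions, not the authors' code, and the full Bayesian inference (NUTS via PyStan) is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def se_kernel(t, length_scale=0.1, variance=1.0):
    """Squared-exponential kernel; length_scale = 0.1 s (~100 ms),
    chosen to echo the paper's prior on the GP length scale."""
    d = t[:, None] - t[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

T = 200                         # time points (illustrative)
t = np.linspace(0.0, 2.0, T)    # 2 s of simulated recording
K = se_kernel(t) + 1e-6 * np.eye(T)  # jitter for numerical stability

n_factors, n_features = 4, 10   # hypothetical sizes

# Draw latent GP factor trajectories: shape (n_factors, T)
L_chol = np.linalg.cholesky(K)
factors = (L_chol @ rng.standard_normal((T, n_factors))).T

# Loadings with a Gaussian prior (Gaussian-Inverse Gamma conjugacy
# in the paper's setup); observations x(t) = Lambda y(t) + noise
Lambda = rng.standard_normal((n_features, n_factors))
noise_sd = 0.1
X = Lambda @ factors + noise_sd * rng.standard_normal((n_features, T))

print(X.shape)  # (10, 200)
```

In the paper's actual pipeline, the loadings and GP hyperparameters would be inferred from data by MCMC (20,000 draws, 5,000 burn-in) rather than fixed as here; this sketch only shows the forward (generative) direction of the model.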