Cross-Spectral Factor Analysis

Authors: Neil Gallagher, Kyle R. Ulrich, Austin Talbot, Kafui Dzirasa, Lawrence Carin, David E. Carlson

NeurIPS 2017

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirically, CSFA discovers networks that are highly predictive of response variables (behavioral context and genotype) for recordings from mice undergoing a behavioral paradigm designed to measure an animal's response to a challenging experience. We further show that incorporating response variables in a supervised multi-objective framework can further map relevant information into a smaller set of features.
Researcher Affiliation | Academia | Duke University
Pseudocode | No | The paper describes the model and inference process in text but does not include any clearly labeled 'Pseudocode' or 'Algorithm' blocks or figures.
Open Source Code | No | The paper does not contain any explicit statement about making its source code publicly available, nor does it provide a link to a code repository.
Open Datasets | No | The paper states, 'We collected a dataset of LFPs recorded from 26 mice from two different genetic backgrounds (14 wild type, 12 CLOCKΔ19).' This dataset was collected by the authors, and no information is provided to indicate it is publicly available, nor is a link or specific citation for public access provided.
Dataset Splits | Yes | We used a 5-fold cross-validation approach to select the number of factors, L, the number of spectral Gaussians per factor (i.e. factor complexity), Q, the rank of the cross-spectral density matrix, R, and the additive noise precision, σ. For each validation set, CSFA models were trained for each combination of L ∈ {10, 20, 30}, Q ∈ {3, 5, 8}, R ∈ {1, 2}, σ ∈ {5, 20}.
Hardware Specification | No | The paper does not specify any particular hardware used for running the experiments (e.g., GPU/CPU models, memory, or cloud instance types).
Software Dependencies | No | The paper mentions using the 'Adam formulation [22]' for optimization and 'multinomial logistic regression [23]' but does not provide specific version numbers for any software libraries or dependencies.
Experiment Setup | Yes | models were trained for 500 Adam iterations, with a learning rate of 0.01 and other learning parameters set to the defaults suggested in [22]. The kernel parameters were then fixed at their values from the 500th iteration and sufficient additional iterations were carried out until the factor scores, {s_w} for w = 1, ..., W, reached approximate convergence. ... L ∈ {10, 20, 30}, Q ∈ {3, 5, 8}, R ∈ {1, 2}, σ ∈ {5, 20}.
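The hyperparameter grid quoted in the Dataset Splits row can be enumerated with a short sketch. Only the hyperparameter values come from the paper; the variable names are my own, and the actual training and scoring of each CSFA configuration is omitted.

```python
from itertools import product

# Hyperparameter values taken from the paper's 5-fold cross-validation search;
# everything else here (names, structure) is illustrative, not the authors' code.
L_vals = (10, 20, 30)      # number of factors, L
Q_vals = (3, 5, 8)         # spectral Gaussians per factor, Q
R_vals = (1, 2)            # rank of the cross-spectral density matrix, R
sigma_vals = (5, 20)       # additive noise precision, sigma

grid = list(product(L_vals, Q_vals, R_vals, sigma_vals))
print(len(grid))           # 3 * 3 * 2 * 2 = 36 configurations per validation fold
```

Each of the 36 configurations would be trained and scored on every held-out fold, and the best-scoring combination selected.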
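The two-stage schedule in the Experiment Setup row (500 joint Adam iterations at learning rate 0.01, then kernel parameters frozen while the factor scores continue to convergence) can be illustrated with a minimal, self-contained Adam update. This is a hedged sketch: the scalar "kernel" and "score" parameters and the toy quadratic loss are stand-ins for the CSFA model, not the authors' implementation.

```python
import math

# One Adam update for a scalar parameter (Kingma & Ba defaults, lr = 0.01,
# as in the paper's setup); returns the new parameter and moment estimates.
def adam_step(theta, grad, m, v, t, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad * grad
    m_hat = m / (1 - b1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)              # bias-corrected second moment
    return theta - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

kernel, score = 5.0, -3.0                  # stand-in parameters (not from the paper)
mk = vk = ms = vs = 0.0
for t in range(1, 501):                    # stage 1: 500 joint Adam iterations
    kernel, mk, vk = adam_step(kernel, 2 * kernel, mk, vk, t)  # grad of theta**2
    score, ms, vs = adam_step(score, 2 * score, ms, vs, t)

frozen_kernel = kernel                     # stage 2: kernel parameters fixed,
for t in range(501, 801):                  # additional iterations on scores only
    score, ms, vs = adam_step(score, 2 * score, ms, vs, t)
```

The split mirrors the paper's description: after iteration 500 only the factor scores receive further updates, with the kernel parameters held at their final stage-1 values.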