Gaussian process based nonlinear latent structure discovery in multivariate spike train data
Authors: Anqi Wu, Nicholas A. Roy, Stephen Keeley, Jonathan W. Pillow
NeurIPS 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We first examine performance using two simulated datasets generated with different kinds of tuning curves, namely sinusoids and Gaussian bumps. We will compare our algorithm (P-GPLVM) with PLDS, PfLDS, P-GPFA and GPLVM (see Table 1), using the tLA and dLA inference methods. We also include an additional variant on the Laplace approximation, which we call the approximated Laplace approximation (aLA), where we use only the explicit (first) term in (Eq. 15) to optimize over X for multiple steps given a fixed f̂ᵢ. |
| Researcher Affiliation | Academia | Anqi Wu, Nicholas A. Roy, Stephen Keeley, & Jonathan W. Pillow, Princeton Neuroscience Institute, Princeton University |
| Pseudocode | Yes | Algorithm 1 Decoupled Laplace approximation at iteration k |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code, nor does it include links to a code repository. |
| Open Datasets | Yes | We used this hippocampal data to identify a 2D latent space using PLDS, PfLDS, P-GPFA, GPLVM and P-GPLVMs (Fig. 3), and compared these to the true 2D location of the rodent. [...] M. Karlsson, M. Carr, and L. M. Frank. Simultaneous extracellular recordings from hippocampal areas CA1 and CA3 (or MEC and CA1) from rats performing an alternation task in two w-shaped tracks that are geometrically identical but visually distinct. crcns.org. http://dx.doi.org/10.6080/K0NK3BZJ, 2005. |
| Dataset Splits | Yes | We split all the time bins in each trial into training time bins (the first 90% time bins) and held-out time bins (the last 10% time bins). |
| Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU, GPU models, or cloud computing resources) used to perform the experiments. |
| Software Dependencies | No | The paper does not specify version numbers for any software dependencies, programming languages, or libraries used in the implementation. |
| Experiment Setup | No | The paper describes the model architecture and inference algorithm but does not provide specific details regarding experimental setup parameters such as hyperparameter values (e.g., learning rates, batch sizes, optimization settings). |
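The experiments quoted above begin with simulated datasets whose spike trains are generated from known tuning curves (sinusoids or Gaussian bumps) over a latent trajectory. The paper does not publish its simulation code, so the following is only a minimal sketch of the Gaussian-bump case under assumed parameters: the latent dimension, trajectory model, bump width, peak rate, and bin size below are all hypothetical choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not specified in the report): T time bins,
# N neurons, and a 1D random-walk latent trajectory x(t).
T, N = 500, 20
x = np.cumsum(rng.normal(scale=0.1, size=T))

# Gaussian-bump tuning curves: neuron n fires maximally when the
# latent x is near its preferred location centers[n].
centers = np.linspace(x.min(), x.max(), N)
width, peak_rate, dt = 0.5, 10.0, 0.05  # assumed width, Hz, and bin size (s)

# Firing rate of each neuron at each time bin: (T x N) matrix.
rates = peak_rate * np.exp(-(x[:, None] - centers[None, :]) ** 2
                           / (2.0 * width ** 2))

# Poisson spike counts per bin, the observation model used by P-GPLVM.
spikes = rng.poisson(rates * dt)
```

The resulting `spikes` array plays the role of the multivariate spike-train observations; swapping the Gaussian bump for a sinusoid of the latent would give the paper's other simulated condition.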
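The dataset-split row states that each trial's time bins are divided into the first 90% for training and the last 10% for held-out evaluation. A split of that form can be sketched as below; the function name and the `(T x N)` array convention are assumptions for illustration, not identifiers from the paper.

```python
import numpy as np

def split_time_bins(spike_counts, train_frac=0.9):
    """Split a (T x N) spike-count array for one trial into training
    (first train_frac of time bins) and held-out (remaining) bins."""
    n_bins = spike_counts.shape[0]
    n_train = int(np.floor(train_frac * n_bins))
    return spike_counts[:n_train], spike_counts[n_train:]

# Usage on a dummy trial of 100 time bins and 5 neurons:
trial = np.zeros((100, 5))
train_bins, held_out_bins = split_time_bins(trial)
```

Splitting on contiguous time bins (rather than at random) preserves the temporal structure that the latent-trajectory models are evaluated on.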