Neural Clustering Processes

Authors: Ari Pakman, Yueqi Wang, Catalin Mitelut, Jinhyung Lee, Liam Paninski

ICML 2020

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "In Section 5 we present two simple examples to illustrate the methods. ... In Section 8 with a neuroscientific application to spike sorting for high-density multielectrode probes. ... We trained NCP for spike clustering using synthetic spikes from a simple yet effective generative model that mimics the distribution of real spikes, and evaluated the spike sorting performance on labeled synthetic data, unlabeled real data and hybrid test data by comparing NCP against two other methods." |
| Researcher Affiliation | Collaboration | Ari Pakman (1), Yueqi Wang (1, 2), Catalin Mitelut (1), Jin Hyung Lee (1), Liam Paninski (1). (1) Columbia University; (2) now at Google. |
| Pseudocode | Yes | Algorithm 1: O(NK) Neural Clustering Process. |
| Open Source Code | Yes | Implementation available at https://github.com/aripakman/neural_clustering_process |
| Open Datasets | Yes | "MNIST digits: We consider next a DPMM over MNIST digits... Training was performed by sampling x_i from the MNIST training set." |
| Dataset Splits | No | The paper mentions using the MNIST training and test sets, but does not provide specific percentages or counts for training, validation, and test splits, nor does it specify how splits were defined for the 2D Gaussian models or the spike sorting data beyond general descriptions. |
| Hardware Specification | No | The paper mentions running experiments on a GPU and using a "GPU-based NCP implementation", but does not specify the GPU model or other hardware components such as CPU or RAM. |
| Software Dependencies | No | The paper mentions using a convolutional neural network and compares against Kilosort (a state-of-the-art spike sorting method), but does not list specific software dependencies (e.g., libraries, frameworks) with version numbers for its own methodology. |
| Experiment Setup | No | The paper states that details of the neural architecture and training/inference pipeline are in the Supplementary Material (SM Sections D and G); these specific details (e.g., hyperparameters) are not provided in the main body of the paper. |
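The O(NK) cost noted in the Pseudocode row comes from the structure of sequential clustering: each of the N points is scored against each of the at-most-K existing clusters plus one candidate new cluster. The sketch below illustrates only that assignment loop; the `score_fn` here is a hypothetical stand-in (a distance-to-cluster-mean heuristic), not the learned set-encoder networks of the actual NCP model, which this page does not reproduce.

```python
import numpy as np

def ncp_style_assign(points, score_fn, rng):
    """Sequentially assign N points to clusters.

    For each point, score the K existing clusters plus one new cluster,
    i.e. O(N*K) score evaluations overall -- the complexity quoted for
    Algorithm 1. `score_fn(point, members)` returns an unnormalized
    log-probability; here it is a toy placeholder, not NCP's networks.
    """
    clusters = []                          # list of lists of point indices
    labels = np.empty(len(points), dtype=int)
    for i, x in enumerate(points):
        # K + 1 candidates: each existing cluster, or opening a new one.
        logits = [score_fn(x, [points[j] for j in c]) for c in clusters]
        logits.append(score_fn(x, []))     # the "new cluster" option
        logits = np.asarray(logits)
        p = np.exp(logits - logits.max())  # softmax over candidates
        p /= p.sum()
        k = rng.choice(len(p), p=p)
        if k == len(clusters):
            clusters.append([i])           # open a new cluster
        else:
            clusters[k].append(i)
        labels[i] = k
    return labels, clusters

def toy_score(x, members):
    """Hypothetical score: favor clusters whose mean is near x."""
    if not members:
        return -1.0                        # fixed log-cost for a new cluster
    mu = np.mean(members, axis=0)
    return -float(np.sum((x - mu) ** 2))

rng = np.random.default_rng(0)
pts = np.concatenate([rng.normal(0, 0.1, (5, 2)),
                      rng.normal(3, 0.1, (5, 2))])
labels, clusters = ncp_style_assign(pts, toy_score, rng)
```

In the real model, amortized inference replaces `toy_score` with neural networks trained so that sampling this loop approximates the posterior over clusterings; the loop structure (and its O(NK) cost) is the same.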