Adaptive Low-Complexity Sequential Inference for Dirichlet Process Mixture Models
Authors: Theodoros Tsiligkaridis, Keith W. Forsythe
NeurIPS 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate through experiments on synthetic and real data sets that our approach is superior to other online state-of-the-art methods. |
| Researcher Affiliation | Academia | Theodoros Tsiligkaridis, Keith W. Forsythe Massachusetts Institute of Technology, Lincoln Laboratory Lexington, MA 02421 USA |
| Pseudocode | Yes | Algorithm 1 Adaptive Sequential Updating and Greedy Search (ASUGS); see the sketch after this table. |
| Open Source Code | No | The paper does not contain any explicit statements or links indicating that the source code for the described methodology is publicly available. |
| Open Datasets | Yes | We used the MNIST data set, which consists of 60,000 training samples, and 10,000 test samples. ... We use only a random 1.667% subset, consisting of 1000 random samples for training. |
| Dataset Splits | Yes | The training set was made up of 500 iid samples, and the test set was made up of 1000 iid samples. |
| Hardware Specification | No | The paper does not specify any hardware details such as CPU, GPU models, or memory specifications used for running the experiments. |
| Software Dependencies | No | The paper does not provide specific version numbers for any software dependencies, libraries, or frameworks used in the implementation or experimentation. |
| Experiment Setup | No | While dataset sizes and splits are mentioned, the paper does not provide specific details about the experimental setup such as hyperparameter values (e.g., learning rate, batch size, number of epochs) or system-level training configurations. |
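The paper's Algorithm 1 (ASUGS) performs single-pass, greedy sequential inference for a Dirichlet process mixture. The full pseudocode is not reproduced in this report, so the following is a minimal sketch of this style of update only, assuming isotropic Gaussian clusters with a conjugate Gaussian prior on the means and a fixed concentration parameter `alpha` standing in for the paper's adaptive rule; the function name and all parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

def gaussian_logpdf(x, mean, var):
    """Log density of an isotropic Gaussian N(mean, var * I)."""
    d = x.shape[0]
    return -0.5 * (d * np.log(2 * np.pi * var) + np.sum((x - mean) ** 2) / var)

def sequential_greedy_dpmm(X, alpha=1.0, obs_var=1.0, prior_var=10.0):
    """Single-pass greedy assignment for a DP mixture of isotropic Gaussians.

    Each point joins the existing cluster maximizing its CRP-weighted
    predictive likelihood, or opens a new cluster; sufficient statistics
    are updated online. Illustrative stand-in for ASUGS's greedy-search
    step: the paper's adaptive concentration-parameter rule is replaced
    here by a fixed alpha.
    """
    counts, sums = [], []          # per-cluster point counts and coordinate sums
    labels = []
    for x in X:
        scores = []
        for n_k, s_k in zip(counts, sums):
            # Posterior over the cluster mean under a conjugate
            # N(0, prior_var * I) prior, given n_k points with sum s_k.
            post_var = 1.0 / (1.0 / prior_var + n_k / obs_var)
            post_mean = post_var * (s_k / obs_var)
            scores.append(np.log(n_k) + gaussian_logpdf(x, post_mean, obs_var + post_var))
        # Score for opening a new cluster (CRP weight alpha).
        scores.append(np.log(alpha) + gaussian_logpdf(x, np.zeros_like(x), obs_var + prior_var))
        k = int(np.argmax(scores))
        if k == len(counts):       # greedy choice: open a new cluster
            counts.append(1)
            sums.append(x.copy())
        else:                      # greedy choice: join cluster k
            counts[k] += 1
            sums[k] += x
        labels.append(k)
    return np.array(labels)

# Toy usage: three well-separated 2-D blobs should yield three clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 2)) for c in (-4.0, 0.0, 4.0)])
labels = sequential_greedy_dpmm(X, alpha=1.0, obs_var=0.3 ** 2)
print("clusters found:", len(set(labels)))
```

Because each point is scored once against current sufficient statistics and then assigned, the cost per observation is linear in the number of active clusters, which is the low-complexity property the paper's title refers to; the adaptive choice of the concentration parameter that distinguishes ASUGS is described in the paper itself.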