A Convex Exemplar-based Approach to MAD-Bayes Dirichlet Process Mixture Models
Authors: Ian En-Hsu Yen, Xin Lin, Kai Zhong, Pradeep Ravikumar, Inderjit Dhillon
ICML 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In our experiments on several benchmark data sets, the proposed method finds the optimal solution of the combinatorial problem and significantly improves on existing methods in terms of the exemplar-based objective. |
| Researcher Affiliation | Academia | 1 Department of Computer Science, University of Texas at Austin, TX 78712, USA. 2 Institute for Computational Engineering and Sciences, University of Texas at Austin, TX 78712, USA. 3 Department of Statistics and Data Sciences, University of Texas at Austin, TX 78712, USA. |
| Pseudocode | Yes | Algorithm 1 ADMM for exemplar-based HDP mixture (6) |
| Open Source Code | No | The paper does not provide any explicit statement or link indicating that the source code for the methodology is openly available. |
| Open Datasets | Yes | Our experiments for the DP mixture model are conducted on 5 publicly available data sets: Iris, Glass, Wine, DNA, and Segment. For the HDP mixture model, we experiment on the Wholesale and Water data sets. ... All of the data sets can be downloaded from the UCI Machine Learning Repository. |
| Dataset Splits | No | The paper mentions the use of datasets for experiments but does not provide specific details on training, validation, or test splits, nor does it describe cross-validation setups. |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used for running the experiments. |
| Software Dependencies | No | The paper does not specify versions for any software dependencies or libraries used in the implementation or experiments. |
| Experiment Setup | Yes | Since the algorithm gives different results for different orders of updating {z_i^(t)}_{i=1}^N, we run the algorithm for 1000 rounds with random permutations of the update order to reach a better local optimum. The global mean is used as initialization, as specified in (Kulis & Jordan, 2012). ... In our experiment, we fix the maximum number of Frank-Wolfe iterations to 30. |