Solving General Elliptical Mixture Models through an Approximate Wasserstein Manifold
Authors: Shengxi Li, Zeyang Yu, Min Xiang, Danilo Mandic
AAAI 2020, pp. 4658-4666 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results demonstrate the excellent performance of the proposed EMM solver. We evaluated the effectiveness of our manifold and the proposed Dadam on both synthetic data and image data, by employing 4 EMMs, i.e., mixtures of Gaussian, Logistic, Cauchy and Gamma (s = 1, a = 2, b = 0.5 in Table 1). (A mixture-density sketch follows the table.) |
| Researcher Affiliation | Academia | Shengxi Li,1 Zeyang Yu,1 Min Xiang,1 Danilo Mandic1 1Imperial College London South Kensington Campus London SW7 2AZ, UK {shengxi.li17, z.yu17, m.xiang13, d.mandic}@imperial.ac.uk |
| Pseudocode | Yes | Alg. 1: Riemannian adaptively accelerated manifold optimisation (a Riemannian Adam-style sketch follows the table) |
| Open Source Code | Yes | The code of this paper is available at https://github.com/ShengxiLi/wass_emm |
| Open Datasets | Yes | Image Data: We adopted the MNIST (Le Cun, Cortes, and Burges 2010) dataset as well as the BSDS500 (Arbelaez et al. 2011) benchmark dataset in our evaluation. |
| Dataset Splits | No | The paper mentions “training and testing sets” but does not provide specific details on train/validation/test dataset splits, percentages, or sample counts. |
| Hardware Specification | Yes | All the methods were run on Matlab 2017a under Intel Core(TM) i7-6700 CPU, where the time was recorded. |
| Software Dependencies | Yes | All the methods were run on Matlab 2017a under Intel Core(TM) i7-6700 CPU, where the time was recorded. |
| Experiment Setup | Yes | Parameter Settings and Metrics: We found the best learning rates α by searching from {0.001, 0.003, 0.01, 0.03, 0.1, 0.3}. Similar to (Kingma and Ba 2014; Becigneul and Ganea 2019), β1 = 0.9 and β2 = 0.999. The maximum number of iterations for testing the synthetic data was 2,000 and that for the image data was 10,000. (A grid-search sketch follows the table.) |
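The four EMMs in the Research Type row are mixtures of elliptical distributions, whose component densities share the form p(x) ∝ |Σ|^{-1/2} g((x − μ)ᵀΣ⁻¹(x − μ)) for a density generator g. The sketch below is a minimal NumPy illustration of evaluating such a mixture log-density with the Gaussian generator g(t) = exp(−t/2), not the paper's implementation; the function name and the omitted normalisation constants are assumptions, and the Logistic, Cauchy and Gamma mixtures of Table 1 would swap in different generators.

```python
import numpy as np

def elliptical_mixture_logpdf(x, weights, mus, sigmas,
                              g=lambda t: np.exp(-t / 2)):
    """Unnormalised log-density of an elliptical mixture at points x (n, d).

    Each component is proportional to |Sigma|^(-1/2) * g(mahalanobis(x)).
    The Gaussian generator g(t) = exp(-t/2) is the default; normalising
    constants are omitted for brevity (hypothetical sketch).
    """
    comp = []
    for w, mu, sigma in zip(weights, mus, sigmas):
        diff = x - mu                                           # (n, d)
        t = np.einsum("ni,ij,nj->n", diff, np.linalg.inv(sigma), diff)
        _, logdet = np.linalg.slogdet(sigma)                    # log |Sigma|
        comp.append(np.log(w) - 0.5 * logdet + np.log(g(t)))
    return np.logaddexp.reduce(np.stack(comp), axis=0)          # log-sum-exp over components
```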
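Alg. 1 itself appears only as pseudocode in the paper; the snippet below is a generic Riemannian Adam-style step in the spirit of (Becigneul and Ganea 2019), not the paper's exact Dadam. The `retract` callable and the omission of parallel transport for the moment buffers are simplifying assumptions.

```python
import numpy as np

def riemannian_adam_step(x, rgrad, state, retract, alpha=0.01,
                         beta1=0.9, beta2=0.999, eps=1e-8):
    """One Riemannian Adam-style update (sketch, NOT the paper's Alg. 1).

    `rgrad` is the Riemannian gradient at x; `retract` maps a tangent
    vector back onto the manifold. Parallel transport of the moment
    buffers is omitted for simplicity.
    """
    m, v, t = state["m"], state["v"], state["t"] + 1
    m = beta1 * m + (1 - beta1) * rgrad                    # first moment (tangent vector)
    v = beta2 * v + (1 - beta2) * np.sum(rgrad * rgrad)    # scalar second moment (squared norm)
    m_hat = m / (1 - beta1 ** t)                           # bias correction
    v_hat = v / (1 - beta2 ** t)
    step = -alpha * m_hat / (np.sqrt(v_hat) + eps)
    return retract(x, step), {"m": m, "v": v, "t": t}      # stay on the manifold
```

On the unit sphere, for example, `retract = lambda x, v: (x + v) / np.linalg.norm(x + v)` is a valid retraction, and the state is initialised as `{"m": np.zeros_like(x), "v": 0.0, "t": 0}`.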
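The Experiment Setup row describes a plain grid search over learning rates with fixed Adam momenta. A minimal sketch, assuming a hypothetical `train_and_score` callable, since the paper does not state its selection criterion:

```python
LEARNING_RATES = [0.001, 0.003, 0.01, 0.03, 0.1, 0.3]  # grid reported in the paper
BETA1, BETA2 = 0.9, 0.999                               # as in (Kingma and Ba 2014)

def pick_learning_rate(train_and_score, rates=LEARNING_RATES):
    """Return the learning rate with the best (highest) score.

    `train_and_score` is a hypothetical callable that trains the EMM
    solver with the given rate and returns a scalar score, e.g. a
    held-out log-likelihood.
    """
    scores = {lr: train_and_score(lr, beta1=BETA1, beta2=BETA2) for lr in rates}
    return max(scores, key=scores.get)
```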