Fixed-Rank Supervised Metric Learning on Riemannian Manifold
Authors: Yadong Mu
AAAI 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Comprehensive numerical experiments conducted on benchmarks clearly suggest that the proposed algorithm is substantially superior to, or on par with, the state-of-the-art in terms of k-NN classification accuracy. We conduct comprehensive evaluations on seven benchmarks to corroborate the effectiveness of the proposed algorithm. |
| Researcher Affiliation | Industry | Yadong Mu, AT&T Labs Research, Middletown, NJ 07748, U.S.A. Email: myd@research.att.com |
| Pseudocode | Yes | Algorithm 1: The Proposed Optimization Method on Riemannian Manifold (a generic sketch of such a fixed-rank update appears after this table). |
| Open Source Code | No | No concrete access to source code (specific repository link, explicit code release statement, or code in supplementary materials) was provided. |
| Open Datasets | Yes | We download three datasets, DNA, Splice, Vowel, from the data repository of LibSVM. KDDCup04 represents the Quantum Physics dataset used for the KDD data mining competition. CIFAR10 is comprised of images from ten semantic categories, such as airplane and horse. COIL20 is another multi-view image object recognition benchmark established by Columbia University. HAR stands for Human Activity Recognition, which is part of the UCI collection and contains sensor recordings from smartphone accelerometer and gyroscope. |
| Dataset Splits | No | The paper states: "In all experiments, we randomly sample 50% data as the training set, and the rest for testing purpose." A separate validation split is not explicitly mentioned. A sketch of this split-and-evaluate protocol is given after the table. |
| Hardware Specification | Yes | All evaluations are conducted on four shared machines in a private large-scale cluster. Each is equipped with 64 CPU cores and 760GB physical memory. |
| Software Dependencies | No | The paper mentions "carefully optimized Matlab code" and using "standard built-in routines in Matlab or source codes obtained from the authors" but does not specify version numbers for Matlab or any other software dependencies. |
| Experiment Setup | Yes | Regarding the parameter tuning, we empirically set the target dimensions of SLPP, LMNN and our proposed FRML (namely the fixed rank k), which are found in the last column of Table 1. ... In Problem (9), λ is set to 10^4 in all experiments. The L1 sparsity parameter is set as 10^3 for SDML. |
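
The Pseudocode row points to Algorithm 1, but no reference implementation accompanies the paper. As a rough illustration of the kind of update a fixed-rank method of this sort performs, the sketch below takes a Euclidean gradient step on a symmetric metric matrix and projects it back onto the set of rank-k positive semidefinite matrices via a truncated eigendecomposition. This is a generic sketch under our own assumptions, not the author's Algorithm 1; the helper names, step size, and projection rule are hypothetical.

```python
import numpy as np

def project_to_rank_k_psd(M, k):
    """Keep the k largest non-negative eigenvalues of a symmetric matrix.
    Hypothetical helper, not taken from the paper."""
    w, V = np.linalg.eigh((M + M.T) / 2)      # symmetrize for numerical safety
    idx = np.argsort(w)[::-1][:k]             # indices of the k largest eigenvalues
    w_k = np.clip(w[idx], 0.0, None)          # enforce positive semidefiniteness
    return (V[:, idx] * w_k) @ V[:, idx].T

def fixed_rank_step(M, grad, k, step_size=1e-3):
    """One illustrative fixed-rank update: Euclidean gradient step followed by
    projection back onto the rank-k PSD set. The paper's Algorithm 1 works with
    a proper Riemannian gradient and retraction on the fixed-rank manifold;
    this sketch only mimics the overall shape of such an iteration."""
    return project_to_rank_k_psd(M - step_size * grad, k)
```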
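
The Research Type and Dataset Splits rows describe the evaluation protocol: a random 50%/50% train/test split followed by k-NN classification accuracy under the learned metric. Below is a minimal sketch of that protocol, assuming integer class labels and a learned linear factor `L` inducing the Mahalanobis metric M = LᵀL; the neighbor count and random seed are illustrative, not values reported in the paper.

```python
import numpy as np

def knn_accuracy_under_metric(X, y, L, n_neighbors=5, seed=0):
    """Random 50%/50% train/test split and k-NN accuracy in the projected
    space x -> L x (Mahalanobis metric M = L^T L). Generic sketch; assumes
    integer class labels in y."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(X))
    half = len(X) // 2
    train_idx, test_idx = perm[:half], perm[half:]    # 50% train, 50% test

    Xtr, Xte = X[train_idx] @ L.T, X[test_idx] @ L.T  # project with the learned factor
    ytr, yte = y[train_idx], y[test_idx]

    correct = 0
    for x, label in zip(Xte, yte):
        d = np.linalg.norm(Xtr - x, axis=1)           # distances in the projected space
        nearest = ytr[np.argsort(d)[:n_neighbors]]    # labels of the k nearest neighbors
        correct += int(np.bincount(nearest).argmax() == label)
    return correct / len(yte)
```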