Linear Manifold Regularization with Adaptive Graph for Semi-supervised Dimensionality Reduction

Authors: Kai Xiong, Feiping Nie, Junwei Han

IJCAI 2017

| Reproducibility Variable | Result | LLM Response |
| --- | --- | --- |
| Research Type | Experimental | Extensive experiments on several benchmark datasets demonstrate the effectiveness of the proposed method. |
| Researcher Affiliation | Academia | 1. Northwestern Polytechnical University, Xi'an, 710072, P. R. China; 2. University of Texas at Arlington, USA |
| Pseudocode | Yes | Algorithm 1: The Proposed Method LMRAG |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | Yes | We use several widely used benchmark datasets JAFFE [1], CMU PIE [Sim et al., 2003], UMIST [2], YALE, YALE-B [3], Corel [Chen et al., 2011] and COIL-20 [4] to evaluate the proposed LMRAG in our experiments. [1] http://www.kasrl.org/jaffe.html; [3] http://www.cad.zju.edu.cn/home/dengcai/Data/data.html; [4] http://www.cs.columbia.edu/CAVE/software/softlib/coil20.php |
| Dataset Splits | Yes | We randomly chose 40% of the samples per class as the training data, and used the remaining 60% as the test data. Among the training data, we randomly selected p = {1, 2, 3} samples per class as the labeled data, and used the remainder as the unlabeled data. |
| Hardware Specification | No | The paper does not explicitly describe the hardware used for experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers. |
| Experiment Setup | Yes | The parameters α and β in LMRAG, SDA, TR-FSDA and SSDL, µ and γ in FME, and γA and γI in LapRLS/L need to be tuned, respectively. We searched their values in the range {10^-6, 10^-4, 10^-2, 10^0, 10^2, 10^4, 10^6}. For fair comparison, the reduced dimensionality was fixed as c in SDA, TR-FSDA and SSDL. We uniformly set the neighbor number k to 5 and chose the bandwidth σ of the Gaussian kernel in a self-tuning way [Chen et al., 2011] while evaluating the classification performance. |
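The dataset-split protocol reported above (40% training / 60% test per class, with p labeled samples per class inside the training portion) can be sketched as follows. This is a minimal illustration, not the authors' code; the toy arrays and the choice of p are assumptions for demonstration.

```python
import numpy as np

def semi_supervised_split(y, train_frac=0.4, p_labeled=1, rng=None):
    """Per-class split: train_frac of each class goes to training,
    the rest to test; p_labeled training samples per class are labeled."""
    rng = np.random.default_rng(rng)
    train_idx, test_idx, labeled_idx = [], [], []
    for c in np.unique(y):
        idx = rng.permutation(np.flatnonzero(y == c))
        n_train = int(round(train_frac * len(idx)))
        train_idx.extend(idx[:n_train])          # 40% training data
        test_idx.extend(idx[n_train:])           # remaining 60% test data
        labeled_idx.extend(idx[:p_labeled])      # labeled subset of training
    return np.array(train_idx), np.array(test_idx), np.array(labeled_idx)

# Toy labels: 3 classes with 10 samples each (illustrative, not a paper dataset)
y = np.repeat([0, 1, 2], 10)
tr, te, lab = semi_supervised_split(y, train_frac=0.4, p_labeled=2, rng=0)
print(len(tr), len(te), len(lab))  # 12 18 6
```

The remaining training samples (those not in the labeled subset) serve as the unlabeled data in the semi-supervised setting.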
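The parameter search described in the experiment setup is a grid search over {10^-6, 10^-4, ..., 10^6}. A minimal sketch follows; the `evaluate` callable standing in for one training-and-scoring run is a hypothetical placeholder, not part of the paper.

```python
import itertools

# Candidate values from the reported range: 10^-6 through 10^6, steps of 10^2
GRID = [10.0 ** e for e in range(-6, 7, 2)]

def grid_search(evaluate):
    """Return the (alpha, beta) pair maximizing the score returned by
    `evaluate`, a placeholder callable scoring one parameter setting."""
    best_pair, best_score = None, float("-inf")
    for alpha, beta in itertools.product(GRID, GRID):
        score = evaluate(alpha, beta)
        if score > best_score:
            best_pair, best_score = (alpha, beta), score
    return best_pair, best_score

# Toy scoring function peaked at alpha = 10^2, beta = 10^-2 (illustrative only)
pair, score = grid_search(lambda a, b: -((a - 1e2) ** 2 + (b - 1e-2) ** 2))
print(pair)  # (100.0, 0.01)
```

With 7 candidate values per parameter, each method with two tunable parameters requires 49 evaluations per configuration.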