Geometric Understanding for Unsupervised Subspace Learning

Authors: Shihui Ying, Lipeng Cai, Changzhou He, Yaxin Peng

IJCAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Finally, we compare the proposed approach with six state-of-the-art methods on two different kinds of real datasets. The experimental results validate that our proposed method outperforms all compared methods." and "To demonstrate the effectiveness of the proposed subspace learning algorithm, in this section, experiments of classification by using the k-NN classifier on learned subspace (reduced feature space) are conducted on two real datasets, i.e., COIL-100 object dataset [Nene et al., 1996], and MNIST digit dataset [Lecun et al., 1998]."
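The quoted protocol evaluates the learned subspace by running a k-NN classifier on the reduced features. A minimal sketch of that evaluation step, assuming a plain Euclidean 1-NN/majority-vote rule in pure Python (the paper's actual subspace projection and its Matlab implementation are not reproduced; the toy features and labels below are invented for illustration):

```python
# Sketch: k-NN classification in a reduced feature space, as used to
# evaluate subspace learning methods. Pure Python, toy 2-D features.
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=1):
    """Majority vote among the k nearest training samples
    (Euclidean distance in the reduced space)."""
    dists = sorted((math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y))
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical "reduced" features for two object classes.
train_X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (0.9, 1.1)]
train_y = ["obj_a", "obj_a", "obj_b", "obj_b"]

print(knn_predict(train_X, train_y, (0.05, 0.1), k=3))  # prints "obj_a"
```

In the paper's setting, `train_X` would instead hold the COIL-100 or MNIST images projected onto the learned subspace.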
Researcher Affiliation | Collaboration | Shihui Ying (1), Lipeng Cai (1), Changzhou He (2) and Yaxin Peng (1); (1) Department of Mathematics, School of Science, Shanghai University, Shanghai, China; (2) Qualcomm (Shanghai) Co. Ltd., China; {shying, xiaocaibao77}@shu.edu.cn, changzhouhe@163.com, yaxin.peng@shu.edu.cn
Pseudocode | Yes | Algorithm 1: Intrinsic algorithm for subspace learning
Open Source Code | No | The paper does not provide any statement or link indicating that the source code for the methodology is openly available.
Open Datasets | Yes | COIL-100 object dataset [Nene et al., 1996] and MNIST digit dataset [LeCun et al., 1998].
Dataset Splits | No | For COIL-100, the paper states: 'randomly select 10 images of each object as the set of training samples, and the rest as the set of testing samples.' For MNIST, it selects training samples ('10, 20, and 30 images of each class as the set of training samples') from a subset of the dataset. Training and testing splits are therefore described, but no separate validation split is mentioned.
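The per-class split the paper describes (10 randomly chosen training images per object, the remainder as test) can be sketched as follows. This is an illustrative pure-Python version with an invented toy dataset, not the authors' Matlab code; `split_per_class` is a hypothetical helper name:

```python
# Sketch: per-class random train/test split as described in the paper.
# No validation split is created, matching the paper's description.
import random

def split_per_class(samples_by_class, n_train=10, seed=0):
    """For each class, randomly pick n_train samples for training
    and put the rest in the test set."""
    rng = random.Random(seed)
    train, test = [], []
    for label, samples in samples_by_class.items():
        shuffled = samples[:]
        rng.shuffle(shuffled)
        train += [(s, label) for s in shuffled[:n_train]]
        test += [(s, label) for s in shuffled[n_train:]]
    return train, test

# Toy data: 2 classes with 72 images each (COIL-100 has 72 views per object).
data = {c: [f"{c}_img{i}" for i in range(72)] for c in ("obj1", "obj2")}
train, test = split_per_class(data, n_train=10)
print(len(train), len(test))  # prints "20 124"
```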
Hardware Specification | Yes | "All programs are written in Matlab 2013a and run by PC with Intel(R) Core(TM) i7-7500U CPU and 32 GB RAM."
Software Dependencies | Yes | "All programs are written in Matlab 2013a."
Experiment Setup | No | The paper mentions using a k-NN classifier and repeating each experiment 20 times with random splits for repeatability. It also notes that 'λ is a balanced parameter' in the model, but the specific value of λ used in the experiments is not provided, nor are other typical hyperparameters such as the k of the k-NN classifier.
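The repetition protocol (20 runs over random splits, results averaged) can be sketched as below. The `evaluate` function is a hypothetical placeholder standing in for one full split-learn-classify run, since the paper's hyperparameters (λ, k) are not reported:

```python
# Sketch: repeat an experiment over 20 random splits and report
# mean/std accuracy, matching the paper's repetition protocol.
import random
import statistics

def evaluate(seed):
    # Placeholder for one run: split data, learn the subspace,
    # classify with k-NN, return accuracy. Here: a dummy value.
    return random.Random(seed).uniform(0.8, 0.9)

accuracies = [evaluate(seed) for seed in range(20)]
print(f"mean={statistics.mean(accuracies):.3f} "
      f"std={statistics.stdev(accuracies):.3f}")
```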