Multi-view Feature Learning with Discriminative Regularization

Authors: Jinglin Xu, Junwei Han, Feiping Nie

IJCAI 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experimental evaluations on four widely used datasets and comparisons with a number of state-of-the-art multi-view clustering algorithms demonstrate the superiority of the proposed work. In this section, we evaluate the proposed framework through the clustering task on four widely used datasets in terms of four standard clustering evaluation metrics, namely Accuracy (ACC) [Cai et al., 2005], Normalized Mutual Information (NMI) [Cai et al., 2005], Jaccard Index (Jaccard) [Varshavsky et al., 2005] and Purity [Varshavsky et al., 2005]. (A hedged sketch of these four metrics is given below the table.)
Researcher Affiliation | Academia | Jinglin Xu, Junwei Han, Feiping Nie, Northwestern Polytechnical University, Xi'an, 710072, P. R. China, {xujinglinlove, junweihan2010, feipingnie}@gmail.com
Pseudocode | Yes | Algorithm 1: The algorithm for solving MVFL.
Open Source Code | No | The paper does not provide any explicit statements about open-source code availability or links to a code repository for the described methodology.
Open Datasets | Yes | For example, image data and their label information, which are publicly available and can be downloaded from ImageNet (http://image-net.org/), CV Datasets (http://www.cvpapers.com/datasets.html) and the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets.html).
Dataset Splits | No | The paper mentions 'training data' and 'testing data' but does not specify a separate 'validation' split for hyperparameter tuning or early stopping.
Hardware Specification | No | The paper does not provide any specific details about the hardware used to run the experiments, such as GPU or CPU models.
Software Dependencies | No | The paper does not specify any software dependencies with version numbers (e.g., programming languages, libraries, frameworks).
Experiment Setup | Yes | For the proposed method, we set the regularization parameter γ = 1 in Eq. (3), and tuned the dimension parameters {m_i}_{i=1}^{K} (m_i < min(d_i, n)) heuristically by searching the grid with a proper step size. Besides, if X_i H X_i^T is nearly singular, we can regularize it as X_i H X_i^T + ε I_{d_i} by introducing a small perturbation ε (ε = 10^-4). (The perturbation trick is sketched below the table.)
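
The Experiment Setup row quotes a perturbation trick: if X_i H X_i^T is nearly singular, regularize it as X_i H X_i^T + ε I_{d_i} with ε = 10^-4. The following is a minimal NumPy sketch of that step only; the helper name regularized_scatter, the condition-number test, and the choice of H as the centering matrix are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

def regularized_scatter(Xi, H, eps=1e-4):
    """Return X_i H X_i^T, adding eps * I_{d_i} when it is nearly singular."""
    S = Xi @ H @ Xi.T
    # The paper only says "nearly singular"; a condition-number check is an
    # assumed, illustrative test for that.
    if np.linalg.cond(S) > 1e12:
        S = S + eps * np.eye(S.shape[0])
    return S

# Illustrative usage with random data. H is taken to be the centering matrix,
# which is an assumption, not something stated in the quoted setup.
rng = np.random.default_rng(0)
d_i, n = 50, 40                       # d_i > n makes X_i H X_i^T rank deficient
Xi = rng.standard_normal((d_i, n))
H = np.eye(n) - np.ones((n, n)) / n
S = regularized_scatter(Xi, H)        # no longer singular after the eps * I term
```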
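
The Research Type row lists the four clustering metrics reported in the paper (ACC, NMI, Jaccard, Purity) without naming an implementation. The sketch below is a minimal illustration assuming NumPy, SciPy and scikit-learn and the standard definitions of these matching-based and pair-counting metrics; the function names are hypothetical.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from sklearn.metrics import normalized_mutual_info_score
from sklearn.metrics.cluster import contingency_matrix

def clustering_accuracy(y_true, y_pred):
    """ACC: best one-to-one mapping of clusters to classes (Hungarian matching)."""
    C = contingency_matrix(y_true, y_pred)
    row, col = linear_sum_assignment(-C)          # maximize matched counts
    return C[row, col].sum() / C.sum()

def purity(y_true, y_pred):
    """Purity: each cluster is credited with its majority class."""
    C = contingency_matrix(y_true, y_pred)
    return C.max(axis=0).sum() / C.sum()

def jaccard_index(y_true, y_pred):
    """Jaccard over sample pairs grouped together in both partitions."""
    same_true = np.equal.outer(y_true, y_true)
    same_pred = np.equal.outer(y_pred, y_pred)
    iu = np.triu_indices(len(y_true), k=1)        # unordered pairs only
    a, b = same_true[iu], same_pred[iu]
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

if __name__ == "__main__":
    y_true = np.array([0, 0, 1, 1, 2, 2])
    y_pred = np.array([1, 1, 0, 0, 2, 0])
    print(clustering_accuracy(y_true, y_pred),
          normalized_mutual_info_score(y_true, y_pred),
          jaccard_index(y_true, y_pred),
          purity(y_true, y_pred))
```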