Orthogonal Non-negative Tensor Factorization based Multi-view Clustering
Authors: Jing Li, Quanxue Gao, Qianqian Wang, Ming Yang, Wei Xia
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments on various benchmark datasets indicate that our proposed method is able to achieve satisfactory clustering performance. The clustering performances are listed in Table 2 and Table 3. We conduct ablation experiments on the orthogonal constraint and the Schatten p-norm on four datasets, as shown in Table 4. We test the convergence of our algorithm by checking the difference between H and the two auxiliary variables (a hedged sketch of such a check appears after this table). |
| Researcher Affiliation | Academia | Jing Li Xidian University Xi'an, Shaanxi, China jinglxd@stu.xidian.edu.cn; Quanxue Gao Xidian University Xi'an, Shaanxi, China qxgao@xidian.edu.cn; Qianqian Wang Xidian University Xi'an, Shaanxi, China qqwang@xidian.edu.cn; Ming Yang Harbin Engineering University Harbin, Heilongjiang, China yangmingmath@gmail.com; Wei Xia Xidian University Xi'an, Shaanxi, China xdweixia@gmail.com |
| Pseudocode | Yes | Algorithm 1 Multi-View Clustering via Orthogonal non-negative Tensor Factorization (Orth-NTF) |
| Open Source Code | Yes | Codes are available: https://github.com/xdjingli/Orth-NTF. |
| Open Datasets | Yes | The following multi-view datasets are selected to examine our proposed method. The details of the datasets are shown in Table 1. MSRC [34]; Hand Written4 [10]; Mnist4 [6]; AWA [11]; Reuters [2]; Noisy MNIST [32]; |
| Dataset Splits | No | The paper mentions using specific datasets for experiments but does not provide details about train/validation/test splits, such as percentages, absolute numbers, or specific methods for creating these splits. It uses the datasets to evaluate "clustering performance". |
| Hardware Specification | Yes | The Reuters and Noisy MNIST are implemented on a standard Windows 10 Server with two Intel(R) Xeon(R) Gold 6230 CPUs 2.1 GHz and 128 GB RAM, MATLAB R2020a. The MSRC, Hand Written4, Mnist4 and AWA are implemented on a laptop computer with an Intel Core i5-8300H CPU and 16 GB RAM, using MATLAB R2018b. |
| Software Dependencies | No | The paper only mentions "MATLAB R2020a" and "Matlab R2018b" as software used. It does not provide version numbers for any other ancillary libraries, frameworks, or solvers that might be necessary for replication beyond the core MATLAB environment. |
| Experiment Setup | Yes | The specific hyper-parameters on each dataset are as follows: MSRC: anchor rate = 0.7, p = 0.5, λ = 100. Hand Written4: anchor rate = 1.0, p = 0.1, λ = 1180. Mnist4: anchor rate = 0.6, p = 0.1, λ = 5000. AWA: anchor rate = 1.0, p = 0.5, λ = 1000. Reuters: anchor rate = 0.005 (anchor number = 100), p = 0.4, λ = 1209800. Noisy MNIST: anchor rate = 0.03, p = 0.1, λ = 200000. (These settings are collected in the configuration sketch below.) |
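
To make the reported settings easier to reuse, here is a minimal configuration sketch in Python. It only restates the hyper-parameters from the Experiment Setup row; the dictionary layout, the key names (`anchor_rate`, `p`, `lambda`), and the `num_anchors` helper are illustrative assumptions and are not taken from the authors' released MATLAB code.

```python
# Hedged sketch, not part of the released MATLAB code: the per-dataset
# hyper-parameters reported in the Experiment Setup row, collected into a
# plain Python dict for convenience.

CONFIGS = {
    "MSRC":         {"anchor_rate": 0.7,   "p": 0.5, "lambda": 100},
    "HandWritten4": {"anchor_rate": 1.0,   "p": 0.1, "lambda": 1180},
    "Mnist4":       {"anchor_rate": 0.6,   "p": 0.1, "lambda": 5000},
    "AWA":          {"anchor_rate": 1.0,   "p": 0.5, "lambda": 1000},
    "Reuters":      {"anchor_rate": 0.005, "p": 0.4, "lambda": 1209800},
    "NoisyMNIST":   {"anchor_rate": 0.03,  "p": 0.1, "lambda": 200000},
}


def num_anchors(dataset: str, n_samples: int) -> int:
    """Assumed convention: anchor count is anchor_rate times the number of
    samples, rounded to the nearest integer. The paper itself only states the
    rates (plus an explicit anchor number of 100 for Reuters)."""
    return round(CONFIGS[dataset]["anchor_rate"] * n_samples)


# Example: look up the settings reported for MSRC.
anchor_rate, p, lam = (CONFIGS["MSRC"][k] for k in ("anchor_rate", "p", "lambda"))
```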
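
The Research Type row notes that convergence is verified by monitoring the gap between H and two auxiliary variables. Below is a minimal sketch of such a stopping test, assuming the auxiliary variables are called J and G and that an infinity-norm tolerance is used; these names, the norm, and the tolerance value are assumptions, since the paper only states that the differences are checked.

```python
import numpy as np


def converged(H: np.ndarray, J: np.ndarray, G: np.ndarray, tol: float = 1e-7) -> bool:
    """Hedged sketch of the convergence test described in the experiments:
    stop once H is sufficiently close to both auxiliary variables.
    The names J and G, the infinity-norm criterion, and the tolerance
    are assumptions, not specifics given in the paper."""
    return max(np.abs(H - J).max(), np.abs(H - G).max()) < tol
```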