Deep Multi-view Subspace Clustering with Anchor Graph
Authors: Chenhang Cui, Yazhou Ren, Jingyu Pu, Xiaorong Pu, Lifang He
IJCAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Empirical studies on real-world datasets show that our method achieves superior clustering performance over other state-of-the-art methods. |
| Researcher Affiliation | Academia | (1) School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, China; (2) Shenzhen Institute for Advanced Study, University of Electronic Science and Technology of China; (3) Department of Computer Science and Engineering, Lehigh University, Bethlehem, USA |
| Pseudocode | Yes | Algorithm 1 Deep Multi-View Subspace Clustering with Anchor Graph (DMCAG) |
| Open Source Code | No | The paper does not explicitly state that the source code for the described methodology is publicly available, nor does it provide a link to a code repository. |
| Open Datasets | Yes | Datasets. As shown in Table 1, our experiments are carried out on six datasets. Specifically, MNIST-USPS [Peng et al., 2019]... Multi-COIL-10 [Xu et al., 2021b]... BDGP [Cai et al., 2012]... UCI-digits is a collection of 2000 samples with 3 views (https://archive.ics.uci.edu/ml/datasets/Multiple%2BFeatures)... Fashion-MV [Xiao et al., 2017]... Handwritten Numerals (HW) contains 2000 samples from 10 categories corresponding to numerals 0-9 (https://archive.ics.uci.edu/ml/datasets.php) |
| Dataset Splits | No | The paper discusses training the autoencoders and uses different processes like self-supervised learning and contrastive learning, but does not explicitly specify validation dataset splits or a distinct validation phase for model tuning. |
| Hardware Specification | Yes | All experiments are performed on a Windows PC with an Intel(R) Core(TM) i5-12600K CPU @ 3.69 GHz, 32.0 GB RAM, and a GeForce RTX 3070 Ti GPU (8 GB). |
| Software Dependencies | No | The paper mentions using convolutional and fully connected neural networks and the Adam optimizer, but it does not specify version numbers for any software libraries or dependencies like Python, PyTorch, or TensorFlow. (A hedged network/optimizer sketch follows this table.) |
| Experiment Setup | Yes | Following [Kang et al., 2020], we select anchor numbers in the range [10, 100]. We select γ from {0.1, 1, 10}. Temperature parameter τ is set to 1 and α is set to 0.001 for all experiments. (A hedged configuration sketch follows this table.) |
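
The Experiment Setup row quotes the hyperparameter ranges reported in the paper. Below is a minimal sketch, in Python, of the search grid those ranges imply; the function name `build_search_space`, the step of 10 over the anchor range, and the dictionary keys are illustrative assumptions, not values stated in the paper.

```python
# Hedged sketch of the reported DMCAG hyperparameter space.
# Only the value ranges are taken from the paper; the step size over the
# anchor range and all identifiers here are assumptions for illustration.
from itertools import product

def build_search_space():
    """Enumerate the grid implied by the reported experiment setup."""
    anchor_numbers = range(10, 101, 10)  # anchors selected in [10, 100] (step assumed)
    gammas = [0.1, 1, 10]                # gamma selected from {0.1, 1, 10}
    tau = 1.0                            # temperature, fixed for all experiments
    alpha = 0.001                        # fixed for all experiments
    for n_anchors, gamma in product(anchor_numbers, gammas):
        yield {"n_anchors": n_anchors, "gamma": gamma, "tau": tau, "alpha": alpha}

if __name__ == "__main__":
    for cfg in build_search_space():
        print(cfg)
```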
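
The Software Dependencies row notes that the paper uses convolutional and fully connected neural networks trained with the Adam optimizer, without pinning any versions. The sketch below, assuming PyTorch, shows only those named components as a fully connected per-view autoencoder pretrained with Adam; layer widths, learning rate, and epoch count are assumptions, and the full DMCAG objective is not reproduced here.

```python
# Minimal sketch, assuming PyTorch: a fully connected per-view autoencoder
# pretrained with Adam. Architecture and hyperparameters are illustrative
# assumptions, not the paper's specification.
import torch
import torch.nn as nn

class ViewAutoencoder(nn.Module):
    """Fully connected autoencoder for a single view (dimensions assumed)."""
    def __init__(self, input_dim: int, latent_dim: int = 10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 500), nn.ReLU(),
            nn.Linear(500, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 500), nn.ReLU(),
            nn.Linear(500, input_dim),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

def pretrain(model: ViewAutoencoder, data: torch.Tensor, epochs: int = 10):
    """Reconstruction pretraining with Adam (learning rate and epochs assumed)."""
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        recon, _ = model(data)
        loss = loss_fn(recon, data)
        loss.backward()
        optimizer.step()
    return model
```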