A Simple Approach to Automated Spectral Clustering
Authors: Jicong Fan, Yiheng Tu, Zhao Zhang, Mingbo Zhao, Haijun Zhang
NeurIPS 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Extensive experiments of natural image clustering show that our method is more versatile, accurate, and efficient than baseline methods. |
| Researcher Affiliation | Academia | 1 The Chinese University of Hong Kong, Shenzhen; 2 Shenzhen Research Institute of Big Data; 3 Chinese Academy of Science, Beijing; 4 University of Chinese Academy of Sciences, Beijing; 5 Hefei University of Technology, Hefei; 6 Donghua University, Shanghai; 7 Harbin Institute of Technology, Shenzhen |
| Pseudocode | Yes | The procedures are summarized into Algorithm 2 (see Appendix B). |
| Open Source Code | Yes | Our MATLAB codes are available at https://github.com/jicongfan/Automated-Spectral-Clustering. |
| Open Datasets | Yes | We test our Auto SC on Extended Yale B Face [Kuang-Chih et al., 2005], ORL Face [Samaria and Harter, 1994], COIL20 [Nene et al., 1996], AR Face [Martínez and Kak, 2001], MNIST [LeCun et al., 1998], Fashion-MNIST [Xiao et al., 2017], GTSRB [Stallkamp et al., 2012], subsets and extracted features of MNIST and Fashion-MNIST. |
| Dataset Splits | No | The paper uses benchmark datasets like MNIST and Fashion-MNIST, but it does not explicitly provide information on the training, validation, or test dataset splits (e.g., percentages or counts) within the main text. |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., GPU/CPU models, memory, or cloud instances) used for running the experiments. |
| Software Dependencies | No | The paper mentions that 'Our MATLAB codes are available at https://github.com/jicongfan/Automated-Spectral-Clustering', indicating the use of MATLAB, but it does not specify a version number for MATLAB or any other software dependencies. |
| Experiment Setup | Yes | The paper mentions specific experimental details such as using 'mini-batch Adam' for optimization, a Gaussian kernel with bandwidth ς = (1/n²) Σᵢⱼ ‖xᵢ − xⱼ‖ (the mean pairwise distance over the data), and notes that 'The parameter settings are in Appendix D.6.' |
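
The Gaussian-kernel bandwidth quoted in the setup row (ς set to the mean pairwise distance over the data) can be sketched as follows. This is a minimal illustration, not the authors' MATLAB implementation; the exact scaling inside the exponent (ς² vs. 2ς²) is an assumption here.

```python
import numpy as np

def gaussian_kernel(X):
    """Gaussian kernel K_ij = exp(-||x_i - x_j||^2 / (2 * sigma^2)),
    with sigma = (1/n^2) * sum_ij ||x_i - x_j|| (mean pairwise distance),
    as quoted from the paper's experiment setup. Illustrative sketch only."""
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (X @ X.T), 0.0)
    sigma = np.sqrt(d2).mean()  # mean over all n^2 pairs, including i == j
    return np.exp(-d2 / (2.0 * sigma ** 2))
```

The resulting kernel matrix is symmetric with unit diagonal and would typically serve as the affinity matrix fed to spectral clustering.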