SpectralNet: Spectral Clustering using Deep Neural Networks
Authors: Uri Shaham, Kelly Stanton, Henry Li, Ronen Basri, Boaz Nadler, Yuval Kluger
ICLR 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our experiments indicate that our network indeed approximates the Laplacian eigenvectors well, allowing the network to cluster challenging non-convex point sets, which recent deep network based methods fail to handle; see examples in Figure 1. Finally, SpectralNet achieves competitive performance on the MNIST handwritten digit dataset and state-of-the-art on the Reuters document dataset, whose size makes standard spectral clustering inapplicable. |
| Researcher Affiliation | Academia | Uri Shaham, Kelly Stanton, Henry Li (Yale University, New Haven, CT, USA), {uri.shaham, kelly.stanton, henry.li}@yale.edu; Boaz Nadler, Ronen Basri (Weizmann Institute of Science, Rehovot, Israel), {boaz.nadler, ronen.basri}@gmail.com; Yuval Kluger (Yale University, New Haven, CT, USA), yuval.kluger@yale.edu |
| Pseudocode | Yes | Algorithm 1: SpectralNet training (a simplified sketch of one training step appears below the table) |
| Open Source Code | Yes | Our implementation is publicly available at https://github.com/kstant0725/SpectralNet. |
| Open Datasets | Yes | Our experiments indicate that our network indeed approximates the Laplacian eigenvectors well, allowing the network to cluster challenging non-convex point sets, which recent deep network based methods fail to handle; see examples in Figure 1. Finally, SpectralNet achieves competitive performance on the MNIST handwritten digit dataset and state-of-the-art on the Reuters document dataset, whose size makes standard spectral clustering inapplicable. |
| Dataset Splits | Yes | The learning rate policy for all nets was determined by monitoring the loss on a validation set (a random subset of the training set); once the validation loss did not improve for a specified number of epochs (see patience epochs in Table 3), we divided the learning rate by 10 (see LR decay in Table 3). A minimal sketch of this decay policy appears below the table. |
| Hardware Specification | Yes | Our SpectralNet implementation took less than 20 minutes to learn the spectral map on this dataset, using a GeForce GTX 1080 GPU. |
| Software Dependencies | No | The paper mentions Python's sklearn.cluster and ARPACK but does not provide specific version numbers for these or any other software components. |
| Experiment Setup | Yes | The architectures of the Siamese net and SpectralNet are described in Table 2. Additional technical details are shown in Table 3. |
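The Pseudocode row points to Algorithm 1 (SpectralNet training) without reproducing it. Below is a minimal, simplified sketch of one training step in PyTorch, written from the paper's description rather than taken from the released code: it assumes a plain Gaussian-kernel affinity on the minibatch (the paper can also use a learned Siamese affinity), replaces the Cholesky-based orthogonalization output layer with a QR factorization of the batch outputs, and folds the paper's alternating orthogonalization/gradient steps into a single step. All function and variable names (`gaussian_affinity`, `orthonorm`, `spectral_loss`, `train_step`, `sigma`) are illustrative.

```python
import torch


def gaussian_affinity(x, sigma):
    # Minibatch affinity W_ij = exp(-||x_i - x_j||^2 / (2 * sigma^2)).
    # Illustrative stand-in for the paper's (optionally Siamese) affinity.
    return torch.exp(-torch.cdist(x, x) ** 2 / (2 * sigma ** 2))


def orthonorm(y):
    # Enforce (1/m) * Y^T Y = I on the batch outputs via QR,
    # a simplification of the paper's Cholesky-based output layer.
    m = y.shape[0]
    q, _ = torch.linalg.qr(y)
    return q * (m ** 0.5)


def spectral_loss(y, w):
    # (1/m^2) * sum_ij W_ij * ||y_i - y_j||^2
    m = y.shape[0]
    return (w * torch.cdist(y, y) ** 2).sum() / m ** 2


def train_step(net, optimizer, x_batch, sigma=1.0):
    w = gaussian_affinity(x_batch, sigma)
    y = orthonorm(net(x_batch))  # orthogonalized spectral embedding of the batch
    loss = spectral_loss(y, w)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

As in standard spectral clustering, cluster assignments are then obtained by running k-means on the learned embedding; the released code linked above implements the paper's full Algorithm 1.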
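The Dataset Splits row describes a reduce-on-plateau learning-rate policy: hold out a random subset of the training set as a validation set, and divide the learning rate by 10 once the validation loss has not improved for a given number of "patience" epochs. Below is a minimal, framework-free sketch of that policy; the class name and the `patience` default are illustrative (the paper's actual values are listed in its Table 3).

```python
class LRDecayOnPlateau:
    """Divide the learning rate by 10 when the validation loss
    stops improving for `patience` consecutive epochs."""

    def __init__(self, lr, patience=10, factor=0.1):
        self.lr = lr
        self.patience = patience
        self.factor = factor
        self.best_loss = float("inf")
        self.epochs_without_improvement = 0

    def step(self, val_loss):
        if val_loss < self.best_loss:
            self.best_loss = val_loss
            self.epochs_without_improvement = 0
        else:
            self.epochs_without_improvement += 1
            if self.epochs_without_improvement >= self.patience:
                self.lr *= self.factor  # LR decay: divide the learning rate by 10
                self.epochs_without_improvement = 0
        return self.lr
```

In a Keras setup, keras.callbacks.ReduceLROnPlateau(factor=0.1, patience=...) implements an equivalent policy; the sketch above just makes the rule explicit.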