Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

SpectralNet: Spectral Clustering using Deep Neural Networks

Authors: Uri Shaham, Kelly Stanton, Henry Li, Ronen Basri, Boaz Nadler, Yuval Kluger

ICLR 2018

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our experiments indicate that our network indeed approximates the Laplacian eigenvectors well, allowing the network to cluster challenging non-convex point sets, which recent deep-network-based methods fail to handle; see examples in Figure 1. Finally, SpectralNet achieves competitive performance on the MNIST handwritten-digit dataset and state-of-the-art results on the Reuters document dataset, whose size makes standard spectral clustering inapplicable.
Researcher Affiliation | Academia | Uri Shaham, Kelly Stanton, Henry Li (Yale University, New Haven, CT, USA); Boaz Nadler, Ronen Basri (Weizmann Institute of Science, Rehovot, Israel); Yuval Kluger (Yale University, New Haven, CT, USA)
Pseudocode | Yes | Algorithm 1: SpectralNet training
Open Source Code | Yes | Our implementation is publicly available at https://github.com/kstant0725/SpectralNet.
Open Datasets | Yes | SpectralNet achieves competitive performance on the MNIST handwritten-digit dataset and state-of-the-art results on the Reuters document dataset, whose size makes standard spectral clustering inapplicable.
Dataset Splits | Yes | The learning-rate policy for all nets was determined by monitoring the loss on a validation set (a random subset of the training set); once the validation loss did not improve for a specified number of epochs (see patience epochs in Table 3), we divided the learning rate by 10 (see LR decay in Table 3).
Hardware Specification | Yes | Our SpectralNet implementation took less than 20 minutes to learn the spectral map on this dataset, using a GeForce GTX 1080 GPU.
Software Dependencies | No | The paper mentions Python's sklearn.cluster and ARPACK but does not provide version numbers for these or any other software components.
Experiment Setup | Yes | The architectures of the Siamese net and SpectralNet are described in Table 2. Additional technical details are shown in Table 3.
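The Research Type row quotes the paper's claim that the network approximates the Laplacian eigenvectors used in spectral clustering. For context, here is a minimal numpy sketch of the classical eigenvector computation that SpectralNet is designed to approximate at scale; the function name, Gaussian-kernel affinity, and symmetric normalization are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def spectral_embed(X, k, sigma=1.0):
    """Classical spectral embedding: compute the k smallest eigenvectors
    of the symmetric normalized graph Laplacian built from data X.
    This is the computation SpectralNet learns to approximate with a net."""
    # Gaussian affinity matrix W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq_dists / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # The k eigenvectors with smallest eigenvalues encode cluster structure;
    # points in the same cluster map to nearby rows of the embedding.
    eigvals, eigvecs = np.linalg.eigh(L)
    return eigvecs[:, :k]
```

Running k-means on the rows of the returned embedding completes standard spectral clustering; the paper's point is that forming and eigendecomposing the full n-by-n affinity matrix, as above, becomes infeasible at Reuters scale, which is what motivates the learned approximation.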
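The Dataset Splits row describes a patience-based schedule: divide the learning rate by 10 once the validation loss has not improved for a set number of epochs. A minimal sketch of that policy follows; the class name and interface are illustrative, not taken from the paper's code.

```python
class PatienceLRDecay:
    """Patience-based LR decay as described in the paper: divide the
    learning rate by `factor` once validation loss stalls for
    `patience` consecutive epochs. (Illustrative sketch.)"""

    def __init__(self, lr, patience, factor=10.0):
        self.lr = lr
        self.patience = patience
        self.factor = factor
        self.best = float("inf")
        self.bad_epochs = 0

    def step(self, val_loss):
        # An improved validation loss resets the patience counter.
        if val_loss < self.best:
            self.best = val_loss
            self.bad_epochs = 0
        else:
            self.bad_epochs += 1
            if self.bad_epochs >= self.patience:
                self.lr /= self.factor
                self.bad_epochs = 0
        return self.lr
```

Calling `step(val_loss)` once per epoch returns the learning rate to use next; the actual patience and decay values per dataset are those listed in the paper's Table 3.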