Deep Spectral Clustering Learning
Authors: Marc T. Law, Raquel Urtasun, Richard S. Zemel
ICML 2017
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on standard real-world datasets confirm state-of-the-art Recall@K performance. |
| Researcher Affiliation | Academia | ¹ Department of Computer Science, University of Toronto, Toronto, Canada. ² CIFAR Senior Fellow. |
| Pseudocode | Yes | Algorithm 1: Deep Spectral Clustering Learning (DSCL); Algorithm 2: DSCL Normalized Spectral Clustering (a spectral-clustering sketch follows the table). |
| Open Source Code | No | The paper does not provide an explicit statement about releasing the source code for their methodology or a link to a code repository. |
| Open Datasets | Yes | The Caltech-UCSD Birds (CUB-200-2011) dataset (Wah et al., 2011), the CARS196 dataset (Krause et al., 2013), and the Stanford Online Products dataset (Song et al., 2016). |
| Dataset Splits | No | The paper explicitly states train and test splits for the datasets but gives no specifics for a validation split. It mentions 'early stopping', which implies a validation set, but provides no split percentages or sample counts for it. |
| Hardware Specification | Yes | We ran our experiments on a single Tesla P100 GPU with 16GB RAM |
| Software Dependencies | No | The paper mentions using 'the Tensorflow package (Abadi et al., 2016)' and the 'Inception (Szegedy et al., 2015) network' but does not specify exact version numbers for these or any other software components. |
| Experiment Setup | Yes | Our batch size is set to n = o × p = 18 × 70 = 1260; our method backpropagates the loss in Eq. (10) for all the examples in the batch. ... we perform 100 iterations of gradient descent. ... We perform 200 iterations of gradient descent. (A batch-sampling sketch follows the table.) |
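For context on the Pseudocode row above: Algorithm 2 applies normalized spectral clustering to the learned embeddings. The sketch below implements the *standard* normalized spectral clustering recipe (Ng, Jordan & Weiss, 2002), not necessarily the paper's exact Algorithm 2; the RBF affinity, the `sigma` parameter, and the function name are assumptions.

```python
# A minimal sketch of standard normalized spectral clustering.
# NOT the paper's exact Algorithm 2; affinity choice is an assumption.
import numpy as np
from sklearn.cluster import KMeans

def normalized_spectral_clustering(X, k, sigma=1.0):
    # Pairwise squared Euclidean distances between embeddings.
    sq = np.sum(X**2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (X @ X.T), 0.0)
    # RBF affinity matrix with a zeroed diagonal (assumed kernel).
    W = np.exp(-d2 / (2.0 * sigma**2))
    np.fill_diagonal(W, 0.0)
    # Symmetric normalization: D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
    W_norm = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # Top-k eigenvectors of the normalized affinity (eigh sorts ascending).
    _, eigvecs = np.linalg.eigh(W_norm)
    F = eigvecs[:, -k:]
    # Row-normalize the spectral embedding before running k-means.
    F /= np.linalg.norm(F, axis=1, keepdims=True) + 1e-12
    return KMeans(n_clusters=k, n_init=10).fit_predict(F)
```

Taking the top eigenvectors of the symmetrically normalized affinity is equivalent to taking the smallest eigenvectors of the normalized Laplacian I − D^{-1/2} W D^{-1/2}, which is the usual spectral-clustering formulation.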
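For the Experiment Setup row, the batch-size arithmetic checks out: 18 × 70 = 1260. Below is a hypothetical sketch of class-balanced batch sampling consistent with that figure, assuming o = 18 examples drawn from each of p = 70 classes; the quote above does not state which of o and p counts classes, so that role assignment is an assumption.

```python
# Hypothetical sketch of the batch composition n = o * p = 18 * 70 = 1260.
# The roles of o and p (examples-per-class vs. classes) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sample_batch_indices(labels, o=18, p=70):
    """Return indices for o examples from each of p sampled classes."""
    classes = rng.choice(np.unique(labels), size=p, replace=False)
    per_class = [rng.choice(np.flatnonzero(labels == c), size=o, replace=False)
                 for c in classes]  # requires >= o examples per class
    return np.concatenate(per_class)

# Toy usage: 100 classes with 50 examples each.
labels = np.repeat(np.arange(100), 50)
idx = sample_batch_indices(labels)
assert idx.size == 1260  # n = o * p = 18 * 70
```

Class-balanced batches of this form are common in deep metric learning; whether the paper samples classes uniformly at random is not specified in the quoted setup.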