Optimal Cluster Recovery in the Labeled Stochastic Block Model

Authors: Se-Young Yun, Alexandre Proutiere

NeurIPS 2016

Reproducibility assessment (variable: result, followed by the LLM response):
- Research Type: Theoretical. "We find the set of parameters such that there exists a clustering algorithm with at most s misclassified items on average under the general LSBM and for any s = o(n), which solves one open problem raised in [2]. We further develop an algorithm, based on simple spectral methods, that achieves this fundamental performance limit within O(n polylog(n)) computations and without a priori knowledge of the model parameters."
- Researcher Affiliation: Collaboration. Se-Young Yun, CNLS, Los Alamos National Lab., Los Alamos, NM 87545 (syun@lanl.gov); Alexandre Proutiere, Automatic Control Dept., KTH, Stockholm 100-44, Sweden (alepro@kth.se).
- Pseudocode: Yes. The SP algorithm consists of two parts, and its detailed pseudo-code is presented at the beginning of the supplementary document (see Algorithm 1).
- Open Source Code: No. The paper does not provide any information or links regarding the availability of open-source code for the described methodology.
- Open Datasets: No. The paper is theoretical, focusing on a mathematical model (the Labeled Stochastic Block Model) rather than empirical evaluation on real-world data. No datasets, public or otherwise, are mentioned.
- Dataset Splits: No. The paper is theoretical and does not involve empirical experiments with dataset splits.
- Hardware Specification: No. The paper is theoretical and analyzes algorithmic complexity rather than empirical execution; it does not specify any hardware used for experiments.
- Software Dependencies: No. The paper does not mention any specific software dependencies or version numbers required for reproducibility.
- Experiment Setup: No. The paper presents an algorithm and its mathematical analysis rather than empirical experiments; it includes no details on experimental setup, hyperparameters, or training configurations.
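The paper's SP algorithm is not reproduced here, but the kind of "simple spectral method" it builds on can be illustrated with a minimal sketch: sample a two-community stochastic block model, take the eigenvector of the second-largest adjacency eigenvalue, and cluster nodes by its sign. All parameters (n, p, q) below are illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200                      # number of nodes (illustrative)
p, q = 0.5, 0.1              # intra- / inter-community edge probabilities

# Planted partition: first half of nodes in community 0, second half in 1.
labels = np.repeat(np.arange(2), n // 2)
prob = np.where(labels[:, None] == labels[None, :], p, q)

# Sample a symmetric adjacency matrix with no self-loops.
A = (rng.random((n, n)) < prob).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Spectral step: eigh returns eigenvalues in ascending order, so the
# eigenvector of the second-largest eigenvalue is column -2. Its sign
# pattern approximately encodes the planted partition.
_, vecs = np.linalg.eigh(A)
pred = (vecs[:, -2] > 0).astype(int)

# Fraction recovered, up to the global label permutation.
acc = max(np.mean(pred == labels), np.mean(pred != labels))
print(f"recovered fraction: {acc:.2f}")
```

With this wide separation between p and q the partition is typically recovered exactly or near-exactly; the paper's contribution concerns the far harder regime near the information-theoretic recovery threshold.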