A Quantum-inspired Classical Algorithm for Separable Non-negative Matrix Factorization

Authors: Zhihuai Chen, Yinan Li, Xiaoming Sun, Pei Yuan, Jialin Zhang

IJCAI 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | In this paper, inspired by recent developments in dequantizing techniques, we propose a new classical algorithm for the separable NMF problem. The algorithm runs in time polynomial in the rank and logarithmic in the size of the input matrices, achieving an exponential speedup in the low-rank setting.
Researcher Affiliation | Academia | 1 CAS Key Lab of Network Data Science and Technology, Institute of Computing Technology, Chinese Academy of Sciences, 100190, Beijing, China; 2 University of Chinese Academy of Sciences, 100049, Beijing, China; 3 Centrum Wiskunde & Informatica and QuSoft, Science Park 123, 1098 XG Amsterdam, Netherlands. {chenzhihuai, sunxiaoming, yuanpei, zhangjialin}@ict.ac.cn, yinan.li@cwi.nl
Pseudocode | Yes | Algorithm 1: Fast Anchors Seeking Algorithm; Algorithm 2: Rejection Sampling for D_Ûy
Open Source Code | No | The paper does not provide any links or explicit statements about the availability of open-source code for the described method.
Open Datasets | No | The paper is theoretical and introduces a "matrix sample model" rather than using publicly available datasets for training or evaluation; no dataset information is provided.
Dataset Splits | No | The paper does not provide training/validation/test dataset splits, as it focuses on theoretical analysis rather than empirical evaluation.
Hardware Specification | No | The paper does not mention any specific hardware, consistent with its theoretical nature.
Software Dependencies | No | The paper does not provide version numbers for any software dependencies.
Experiment Setup | No | The paper focuses on theoretical algorithm design and analysis, and therefore does not report experimental details such as hyperparameters or training settings.
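The paper's Algorithm 2 is a rejection-sampling routine for the distribution D_Ûy. As a general illustration of the rejection-sampling pattern it relies on (this is a generic sketch over a toy discrete distribution, not a reconstruction of the paper's Algorithm 2; all names and numbers here are illustrative):

```python
import random

def rejection_sample(target_pmf, proposal_pmf, proposal_sampler, M, rng=random):
    """Draw one sample from target_pmf via rejection sampling.

    Correctness requires target_pmf(x) <= M * proposal_pmf(x) for all x
    in the support of the proposal.
    """
    while True:
        x = proposal_sampler()
        # Accept x with probability target(x) / (M * proposal(x));
        # otherwise loop and draw a fresh proposal.
        if rng.random() < target_pmf(x) / (M * proposal_pmf(x)):
            return x

# Toy example: a target distribution over {0, 1, 2}, sampled through a
# uniform proposal. M = 1.5 because max_x target(x)/proposal(x) = 0.5/(1/3).
target = {0: 0.5, 1: 0.3, 2: 0.2}
proposal = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}
M = 1.5

sample = rejection_sample(target.get, proposal.get,
                          lambda: random.randrange(3), M)
```

The expected number of proposal draws per accepted sample is M, so the technique stays efficient only when the proposal tracks the target closely; the paper's contribution lies in constructing such proposals from sampling access to the input matrix.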