Orthogonal NMF through Subspace Exploration

Authors: Megasthenis Asteris, Dimitris Papailiopoulos, Alexandros G. Dimakis

NeurIPS 2015 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Experimental | We evaluate our algorithms on several real and synthetic datasets and show that their performance matches or outperforms the state of the art.
Researcher Affiliation | Academia | Megasthenis Asteris, The University of Texas at Austin (megas@utexas.edu); Dimitris Papailiopoulos, University of California, Berkeley (dimitrisp@berkeley.edu); Alexandros G. Dimakis, The University of Texas at Austin (dimakis@austin.utexas.edu)
Pseudocode | Yes | Algorithm 1 (Low Rank NNPCA), Algorithm 2 (ONMFS), Algorithm 3 (Local Opt W); a hedged sketch of this pipeline follows the table.
Open Source Code | No | The paper does not provide any explicit statements or links indicating that the source code for the described methodology is publicly available.
Open Datasets | Yes | CBCL Dataset. The CBCL dataset [30] contains 2429, 19 × 19 pixel, gray-scale face images. It has been used in the evaluation of all three methods [16, 17, 27]. Additional Datasets. We solve the NNPCA problem on various datasets obtained from [31].
Dataset Splits | No | The paper does not provide specific details on dataset splits for training, validation, or testing, nor does it mention cross-validation setups.
Hardware Specification | No | The paper does not provide any specific details about the hardware (e.g., CPU, GPU models, memory) used to run the experiments.
Software Dependencies | No | The paper mentions general tools like SVD, but it does not specify any software dependencies (e.g., libraries, frameworks) with version numbers.
Experiment Setup | Yes | For our algorithm, we use a sketch of rank r = 4 of the (centered) input data. Further, we apply an early termination criterion; execution is terminated if no improvement is observed over a number of consecutive iterations (samples). We set a high penalty (α = 1e10) to promote orthogonality. We run ONMF methods with target dimension k = 5. For the methods that involve random initialization, we run 10 averaging iterations per Monte Carlo trial. We compare our algorithm with several state-of-the-art ONMF algorithms: i) the O-PNMF algorithm of [13] (for 1000 iterations), and ii) the more recent ONP-MF and iii) EM-ONMF algorithms of [11, 32] (for 1000 iterations).
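
The Pseudocode row above only names Algorithm 1 (Low Rank NNPCA), Algorithm 2 (ONMFS), and Algorithm 3 (Local Opt W) without reproducing them, and no code was released. The snippet below is therefore only a rough, hypothetical Python sketch of a subspace-exploration loop in that spirit: it builds a rank-r sketch of the data, samples candidate directions inside that r-dimensional subspace, and keeps the nonnegative, disjoint-support factor W with the best fit. The function name `onmf_subspace_sketch`, the Gaussian sampling scheme, and the support-assignment rule are illustrative assumptions, not the authors' exact algorithms.

```python
import numpy as np


def onmf_subspace_sketch(M, k=5, r=4, n_samples=500, patience=50, seed=0):
    """Rough, hypothetical sketch of ONMF via subspace exploration.

    M is an (m x n) nonnegative matrix. The loop searches for an (m x k)
    nonnegative W with orthogonal columns (disjoint row supports) that
    scores well under ||W.T @ M||_F^2. This is an illustration only, not
    the paper's Algorithms 1-3.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape

    # Rank-r sketch: keep only the leading r left singular vectors.
    U, _, _ = np.linalg.svd(M, full_matrices=False)
    Ur = U[:, :r]

    best_W, best_fit, stall = None, -np.inf, 0
    for _ in range(n_samples):
        # Candidate directions drawn at random inside the r-dim subspace.
        C = Ur @ rng.standard_normal((r, k))          # m x k candidates
        labels = np.argmax(C, axis=1)                 # disjoint row supports

        W = np.zeros((m, k))
        for j in range(k):
            rows = np.flatnonzero(labels == j)
            if rows.size == 0:
                continue
            # Best unit vector supported on these rows: leading left
            # singular vector of the corresponding block of M.
            u, _, _ = np.linalg.svd(M[rows, :], full_matrices=False)
            w = np.maximum(u[:, 0] if u[:, 0].sum() >= 0 else -u[:, 0], 0)
            norm = np.linalg.norm(w)
            if norm > 0:
                W[rows, j] = w / norm

        fit = np.linalg.norm(W.T @ M) ** 2            # ||W^T M||_F^2
        if fit > best_fit + 1e-12:
            best_W, best_fit, stall = W, fit, 0
        else:
            stall += 1
            if stall >= patience:                     # early termination
                break
    return best_W, best_fit
```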
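
The Experiment Setup row quotes the paper's reported parameters. Purely as an illustration of how those settings could be collected into a reproducible harness, here is a hedged configuration snippet; the dictionary keys, the baseline labels' grouping, and the commented-out driver are placeholders chosen for this report, not released code.

```python
# Hypothetical configuration mirroring the quoted setup (not released code).
ONMF_CONFIG = {
    "sketch_rank": 4,               # rank-r sketch of the (centered) input data
    "target_dimension": 5,          # ONMF target dimension k
    "orthogonality_penalty": 1e10,  # high penalty alpha promoting orthogonality
    "early_termination": True,      # stop if no improvement over consecutive samples
    "monte_carlo_averaging": 10,    # averaging iterations for randomly initialized methods
}

# Baselines compared in the paper, with the iteration counts it reports.
BASELINES = {
    "O-PNMF": {"iterations": 1000},   # [13]
    "ONP-MF": {},                     # [11, 32]
    "EM-ONMF": {"iterations": 1000},  # [11, 32]
}

# Example (hypothetical) driver tying the config to the sketch above:
# W, fit = onmf_subspace_sketch(M,
#                               k=ONMF_CONFIG["target_dimension"],
#                               r=ONMF_CONFIG["sketch_rank"])
```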