A Non-generative Framework and Convex Relaxations for Unsupervised Learning

Authors: Elad Hazan, Tengyu Ma

NeurIPS 2016

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We give a novel formal theoretical framework for unsupervised learning with two distinctive characteristics. First, it does not assume any generative model and is based on a worst-case performance metric. Second, it is comparative: performance is measured with respect to a given hypothesis class.
Researcher Affiliation | Academia | Elad Hazan, Princeton University, 35 Olden Street, 08540 (ehazan@cs.princeton.edu); Tengyu Ma, Princeton University, 35 Olden Street, NJ 08540 (tengyu@cs.princeton.edu).
Pseudocode | Yes | Algorithm 1: group encoding/decoding for improper dictionary learning.
Open Source Code | No | The paper does not contain any statements or links indicating that open-source code for the described methodology is provided.
Open Datasets | No | The paper is theoretical and does not describe empirical experiments on specific, publicly available datasets.
Dataset Splits | No | The paper is theoretical and does not discuss dataset splits for training, validation, or testing.
Hardware Specification | No | The paper does not provide any specific hardware details used for running experiments.
Software Dependencies | No | The paper does not list any specific software components with version numbers.
Experiment Setup | No | The paper is theoretical and does not describe specific experimental setup details such as hyperparameters or training configurations.