A Computationally Efficient Method for Learning Exponential Family Distributions

Authors: Abhin Shah, Devavrat Shah, Gregory Wornell

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our work is theoretical in nature.
Researcher Affiliation | Academia | Abhin Shah (MIT, abhin@mit.edu); Devavrat Shah (MIT, devavrat@mit.edu); Gregory W. Wornell (MIT, gww@mit.edu)
Pseudocode | Yes | Algorithm 1: Projected Gradient Descent (a generic sketch follows this table)
Open Source Code | No | The paper states that it is theoretical in nature and does not conduct experiments. Therefore, it does not provide open-source code for its methodology.
Open Datasets | No | The paper states that it is theoretical in nature and does not conduct experiments. It does not mention using any datasets, public or otherwise, for training or evaluation.
Dataset Splits | No | The paper states that it is theoretical in nature and does not conduct experiments. It does not provide any training/test/validation dataset splits.
Hardware Specification | No | The paper states that it is theoretical in nature and does not conduct experiments. Therefore, it does not describe any specific hardware used.
Software Dependencies | No | The paper states that it is theoretical in nature and does not conduct experiments. Therefore, it does not provide specific software dependencies or version numbers.
Experiment Setup | No | The paper states that it is theoretical in nature and does not conduct experiments. Therefore, it does not provide specific experimental setup details such as hyperparameters or training settings.
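
The "Pseudocode" row refers to the paper's Algorithm 1, a projected gradient descent routine. As a point of reference only, here is a minimal generic sketch of projected gradient descent; the quadratic objective, the l2-ball constraint, and all parameter values below are illustrative assumptions, not the paper's actual objective or projection set.

```python
import numpy as np

def project_l2_ball(theta, radius):
    """Euclidean projection onto the l2 ball of the given radius."""
    norm = np.linalg.norm(theta)
    if norm <= radius:
        return theta
    return theta * (radius / norm)

def projected_gradient_descent(grad_f, theta0, radius, step_size, n_iters):
    """Generic projected gradient descent: take a gradient step, then
    project the iterate back onto the feasible set (here, an l2 ball)."""
    theta = project_l2_ball(np.asarray(theta0, dtype=float), radius)
    for _ in range(n_iters):
        theta = theta - step_size * grad_f(theta)
        theta = project_l2_ball(theta, radius)
    return theta

# Illustrative use: minimize f(theta) = 0.5 * ||A @ theta - b||^2 over the
# l2 ball of radius 1 (both choices are assumptions for this sketch).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda th: A.T @ (A @ th - b)
theta_hat = projected_gradient_descent(grad, np.zeros(5), radius=1.0,
                                       step_size=0.01, n_iters=500)
print(np.linalg.norm(theta_hat))  # stays <= 1.0: feasible by construction
```

The projection step is what keeps every iterate inside the constraint set, which is the property that convergence analyses of projected gradient methods typically rely on.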