Think out of the "Box": Generically-Constrained Asynchronous Composite Optimization and Hedging

Authors: Pooria Joulani, András György, Csaba Szepesvári

NeurIPS 2019

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We present two new algorithms, ASYNCADA and HEDGEHOG, for asynchronous sparse online and stochastic optimization. ASYNCADA is, to our knowledge, the first asynchronous stochastic optimization algorithm with finite-time data-dependent convergence guarantees for generic convex constraints.
Researcher Affiliation | Industry | Pooria Joulani, DeepMind, UK (pjoulani@google.com); András György, DeepMind, UK (agyorgy@google.com); Csaba Szepesvári, DeepMind, UK (szepi@google.com)
Pseudocode | Yes | Algorithm 1: ASYNCADA: Asynchronous Composite Adaptive Dual Averaging. Algorithm 2: HEDGEHOG: Asynchronous Stochastic Exponentiated Gradient. (Minimal illustrative sketches of the two underlying update rules appear after this table.)
Open Source Code | No | The paper neither links to nor explicitly states the release of open-source code for the described methodology.
Open Datasets | No | The paper is theoretical and does not conduct experiments on any specific dataset. The mention of 'training data' in Section 1 refers to machine learning methods in general, not to data used in this paper's research.
Dataset Splits | No | The paper is theoretical and runs no experiments, so it specifies no training/validation/test splits.
Hardware Specification | No | The paper is theoretical and does not describe any experimental hardware.
Software Dependencies | No | The paper is theoretical and does not list software dependencies or version numbers needed for experimental reproducibility.
Experiment Setup | No | The paper is theoretical and does not describe experimental setup details such as hyperparameters or training configurations.
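
For orientation only: Algorithm 1, ASYNCADA, is described as asynchronous composite adaptive dual averaging. Below is a minimal serial sketch of an AdaGrad-style composite dual-averaging update with an l1 composite term, one of the building blocks that the paper makes asynchronous. The function name, the grad_oracle callback, and the eta/l1_reg parameters are illustrative assumptions, and the sketch omits the asynchronous shared-state updates and generic-constraint handling that constitute the paper's actual contribution.

```python
import numpy as np

def adaptive_composite_dual_averaging(grad_oracle, dim, num_steps,
                                      eta=1.0, l1_reg=0.0, eps=1e-8):
    """Serial sketch (not the paper's ASYNCADA) of AdaGrad-style composite dual averaging.

    At step t the iterate minimizes
        <sum of past gradients, x> + t * l1_reg * ||x||_1
        + (1 / (2 * eta)) * sum_i sqrt(sum of past g_i^2) * x_i^2,
    which has the per-coordinate soft-thresholding solution used below.
    """
    grad_sum = np.zeros(dim)      # running sum of (stochastic) gradients
    sq_grad_sum = np.zeros(dim)   # running sum of squared gradients (AdaGrad scaling)
    x = np.zeros(dim)
    for t in range(1, num_steps + 1):
        g = grad_oracle(x)        # hypothetical stochastic-gradient callback
        grad_sum += g
        sq_grad_sum += g * g
        scale = eta / (np.sqrt(sq_grad_sum) + eps)
        # Soft-thresholding solves the l1 composite term in closed form.
        x = -scale * np.sign(grad_sum) * np.maximum(np.abs(grad_sum) - t * l1_reg, 0.0)
    return x
```

With l1_reg=0 this reduces to plain AdaGrad dual averaging; the closed-form soft-threshold is what makes the composite step cheap and sparsity-friendly in this simplified setting.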
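Algorithm 2, HEDGEHOG, is described as asynchronous stochastic exponentiated gradient. Below is a minimal serial sketch of the exponentiated-gradient (Hedge) update on the probability simplex, i.e., dual averaging with the entropic regularizer. Again, the names and the fixed eta step size are illustrative assumptions, and none of the paper's asynchronous machinery is modeled.

```python
import numpy as np

def exponentiated_gradient(grad_oracle, dim, num_steps, eta=0.1):
    """Serial sketch (not the paper's HEDGEHOG) of exponentiated gradient on the simplex.

    Keeps a cumulative gradient and sets the iterate to the softmax of its
    negative scaled sum, i.e., entropic-regularized dual averaging.
    """
    grad_sum = np.zeros(dim)
    x = np.full(dim, 1.0 / dim)   # start at the uniform distribution
    for _ in range(num_steps):
        g = grad_oracle(x)        # hypothetical stochastic-gradient callback
        grad_sum += g
        w = np.exp(-eta * (grad_sum - grad_sum.min()))  # shift keeps exponents <= 0 for stability
        x = w / w.sum()
    return x
```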