Leveraging Predictions in Smoothed Online Convex Optimization via Gradient-based Algorithms
Authors: Yingying Li, Na Li
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Lastly, we numerically test the performance of RHIG on quadrotor tracking problems. |
| Researcher Affiliation | Academia | Yingying Li SEAS Harvard University Cambridge, MA, USA. 02138 yingyingli@g.harvard.edu Na Li SEAS Harvard University Cambridge, MA, USA. 02138 nali@seas.harvard.edu |
| Pseudocode | Yes | Algorithm 2: Receding Horizon Inexact Gradient (RHIG) |
| Open Source Code | No | The paper does not provide an unambiguous statement of releasing open-source code for the described methodology, nor does it include a direct link to a code repository. |
| Open Datasets | No | The paper describes generating data based on models (e.g., 'quadrotor tracking of a vertically moving target [37]', 'target θ_t follows: θ_t = y_t + q_t, where y_t = γ y_{t−1} + e_t is an autoregressive process with noise e_t [38]') but does not provide concrete access information (link, DOI, repository, or specific dataset name with access) for a publicly available dataset. |
| Dataset Splits | No | The paper describes numerical experiments but does not provide specific details on training, validation, or test dataset splits (e.g., percentages, sample counts, or cross-validation setup). |
| Hardware Specification | No | The paper does not provide specific hardware details (exact GPU/CPU models, processor types, or detailed computer specifications) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details, such as library or solver names with version numbers, needed to replicate the experiment. |
| Experiment Setup | No | The paper discusses model parameters for the quadrotor tracking problem (e.g., the cost 'min Σ_{t=1}^{T} ½(α(x_t − θ_t)² + β(x_t − x_{t−1})²)' and 'γ = 0.3 and γ = 0.7') but does not provide specific experimental setup details such as concrete hyperparameter values (e.g., learning rate, batch size) or training configurations for the RHIG algorithm. |
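The quoted target model and tracking cost can be sketched as follows. This is a minimal illustration, not the paper's RHIG algorithm: the deterministic component q_t (unspecified in the excerpt) is set to zero, the noise scale, horizon, and cost weights α, β are assumed values, and the policy x_t = θ_{t−1} is a hypothetical baseline used only to evaluate the cost.

```python
import random

def simulate_tracking_cost(T=50, gamma=0.3, alpha=1.0, beta=1.0,
                           sigma_e=0.1, seed=0):
    """Simulate the target theta_t = y_t + q_t, where
    y_t = gamma * y_{t-1} + e_t is an autoregressive process,
    and evaluate the tracking cost
        sum_t 0.5 * (alpha*(x_t - theta_t)^2 + beta*(x_t - x_{t-1})^2)
    for a naive policy x_t = theta_{t-1} (illustrative assumption)."""
    rng = random.Random(seed)
    y = 0.0
    thetas = []
    for t in range(T):
        e = rng.gauss(0.0, sigma_e)   # AR noise e_t (assumed Gaussian)
        y = gamma * y + e             # y_t = gamma * y_{t-1} + e_t
        q = 0.0                       # q_t unspecified in the excerpt; set to 0
        thetas.append(y + q)          # theta_t = y_t + q_t

    cost, x_prev = 0.0, 0.0
    for t, theta in enumerate(thetas):
        x = thetas[t - 1] if t > 0 else 0.0  # hypothetical policy: track previous target
        cost += 0.5 * (alpha * (x - theta) ** 2 + beta * (x - x_prev) ** 2)
        x_prev = x
    return cost

print(simulate_tracking_cost(gamma=0.3))
print(simulate_tracking_cost(gamma=0.7))
```

Running it with γ = 0.3 and γ = 0.7 (the two settings quoted from the paper) gives a nonnegative cumulative cost in each case; the actual paper compares RHIG against baselines on this problem rather than the naive policy above.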