Unconstrained Dynamic Regret via Sparse Coding
Authors: Zhiyu Zhang, Ashok Cutkosky, Yannis Paschalidis
NeurIPS 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The paper concludes with an application in fine-tuning time series forecasters, where unconstrained dynamic OCO is naturally motivated. Due to limited space, this is deferred to Appendix E, with experiments that support our theoretical results. |
| Researcher Affiliation | Academia | Zhiyu Zhang Harvard University zhiyuz@seas.harvard.edu Ashok Cutkosky Boston University ashok@cutkosky.com Ioannis Ch. Paschalidis Boston University yannisp@bu.edu |
| Pseudocode | Yes | Algorithm 1 Sparse coding with size 1 dictionary. ... Algorithm 2 Sparse coding with general dictionary. ... Algorithm 3 FREEGRAD [MK20, Definition 4]: scale-free and gradient adaptive unconstrained static OLO. ... Algorithm 4 Haar OLR with known time horizon. ... Algorithm 5 Anytime Haar OLR (Algorithm 4 with doubling trick). |
| Open Source Code | Yes | Code is available at https://github.com/zhiyuzz/NeurIPS2023-Sparse-Coding. |
| Open Datasets | Yes | Here we use the Jena weather forecasting dataset [16], which records the weather data at a German city, Jena, every 10 minutes. ... [16] Available at https://www.bgc-jena.mpg.de/wetter/. |
| Dataset Splits | No | The paper uses synthetic data and a weather dataset, but does not explicitly provide information on train/validation/test splits, percentages, or sample counts for these datasets. |
| Hardware Specification | No | The paper does not specify any hardware details (e.g., CPU, GPU models, or memory) used for running the experiments. |
| Software Dependencies | No | The paper discusses algorithms and frameworks but does not specify any software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions). |
| Experiment Setup | Yes | Both algorithms require a confidence hyperparameter ε, and we set it to 1. |
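Algorithm 5 in the pseudocode list converts the fixed-horizon Algorithm 4 into an anytime method via the doubling trick. As a minimal sketch of that general wrapper pattern (the learner class, its interface, and all names here are hypothetical illustrations, not the paper's code):

```python
class FixedHorizonLearner:
    """Toy stand-in for a fixed-horizon online learner (cf. Algorithm 4).
    Here it simply predicts the running mean of the gradients seen so far."""

    def __init__(self, horizon):
        self.horizon = horizon
        self.grad_sum = 0.0
        self.count = 0

    def predict(self):
        return self.grad_sum / self.count if self.count else 0.0

    def update(self, gradient):
        self.grad_sum += gradient
        self.count += 1


def anytime_via_doubling(gradients):
    """Doubling trick: run epochs m = 0, 1, 2, ... of length 2^m,
    restarting a fresh fixed-horizon learner at each epoch boundary."""
    predictions = []
    t = 0
    epoch = 0
    while t < len(gradients):
        horizon = 2 ** epoch
        learner = FixedHorizonLearner(horizon)
        for _ in range(horizon):
            if t >= len(gradients):
                break
            predictions.append(learner.predict())  # play before seeing g_t
            learner.update(gradients[t])           # then observe gradient
            t += 1
        epoch += 1
    return predictions


preds = anytime_via_doubling([1.0, 2.0, 3.0, 4.0, 5.0])
# Epochs cover rounds {1}, {2,3}, {4,5}; the learner restarts at each boundary,
# so predictions reset to 0.0 at rounds 1, 2, and 4.
```

This only illustrates the restart schedule; the paper's actual Algorithm 5 wraps its Haar OLR method (Algorithm 4), not this toy mean predictor.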