Smoothed Online Convex Optimization Based on Discounted-Normal-Predictor

Authors: Lijun Zhang, Wei Jiang, Jinfeng Yi, Tianbao Yang

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We also conduct preliminary experiments to verify our theories, and present the results in Appendix B."
Researcher Affiliation | Collaboration | Lijun Zhang (1,2), Wei Jiang (1), Jinfeng Yi (3), Tianbao Yang (4). 1: National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China; 2: Peng Cheng Laboratory, Shenzhen 518055, China; 3: JD AI Research, Beijing, China; 4: Department of Computer Science and Engineering, Texas A&M University, College Station, USA
Pseudocode | Yes | Algorithm 1: Discounted-Normal-Predictor; Algorithm 2: Discounted-Normal-Predictor with conservative updating (DNP-cu); Algorithm 3: Combiner; Algorithm 4: Smoothed OGD
Open Source Code | No | "Did you include the code, data, and instructions needed to reproduce the main experimental results (either in the supplemental material or as a URL)? [No]"
Open Datasets | No | The paper mentions that preliminary experiments are conducted, but it does not provide concrete access information (link, DOI, or specific citation) for a publicly available dataset in the main text or appendix.
Dataset Splits | No | The paper does not provide the dataset-split information (exact percentages, sample counts, citations to predefined splits, or a detailed splitting methodology) needed to reproduce the data partitioning.
Hardware Specification | Yes | "The experiments are conducted on a single GPU (NVIDIA GeForce RTX 2080 Ti) with 11GB memory and an Intel Core i7-9700K CPU."
Software Dependencies | No | "We implement our algorithm by PyTorch." (No PyTorch version number is provided.)
Experiment Setup | Yes | "We choose different step sizes η(i) for OGD: 0.1, 0.01, 0.001, 0.0001. The number of experts K is set to 4. We choose λ = 1 and the total number of iterations T = 5000. For each algorithm, we run it for 50 times and report the mean value and standard deviation."
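The reported setup (K = 4 OGD experts with step sizes 0.1, 0.01, 0.001, 0.0001; T = 5000 iterations; 50 repeated runs with mean and standard deviation) can be sketched in code. This is a minimal illustration of the experimental protocol only, not the paper's algorithm: the loss sequence here is an assumed toy 1-D quadratic with a drifting target, since the paper's actual loss functions are not specified in this report, and NumPy stands in for the authors' PyTorch implementation.

```python
import numpy as np

# Configuration as reported in the paper's experiment setup.
STEP_SIZES = [0.1, 0.01, 0.001, 0.0001]  # eta^(i), one per expert (K = 4)
T = 5000                                 # total number of iterations
RUNS = 50                                # repetitions for mean / std dev

def run_ogd(eta, rng):
    """One OGD trajectory on a toy loss f_t(x) = (x - theta_t)^2
    with a slowly drifting target theta_t; returns cumulative loss."""
    x, theta, total = 0.0, 0.0, 0.0
    for _ in range(T):
        theta += 0.01 * rng.standard_normal()  # drifting comparator
        total += (x - theta) ** 2              # incur the loss
        grad = 2.0 * (x - theta)               # gradient of the loss at x
        x -= eta * grad                        # OGD update
    return total

rng = np.random.default_rng(0)
results = {eta: [run_ogd(eta, rng) for _ in range(RUNS)]
           for eta in STEP_SIZES}
for eta, losses in results.items():
    print(f"eta={eta}: mean={np.mean(losses):.2f} "
          f"+/- {np.std(losses):.2f}")
```

Each step size plays the role of one expert; in the paper these trajectories are then aggregated (e.g., by the Combiner of Algorithm 3), a step omitted here.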