Double Momentum Method for Lower-Level Constrained Bilevel Optimization

Authors: Wanli Shi, Yi Chang, Bin Gu

ICML 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on two applications demonstrate the effectiveness of our proposed method.
Researcher Affiliation | Academia | School of Artificial Intelligence, Jilin University, China; Mohamed bin Zayed University of Artificial Intelligence, UAE.
Pseudocode | Yes | Algorithm 1 DMLCBO
Open Source Code | No | The paper mentions implementing methods using PyTorch but does not provide any links to its own source code or state that it is open-source.
Open Datasets | Yes | In this experiment, we evaluate all the methods on the datasets MNIST, Fashion MNIST, Cod-RNA, and Madelon (https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/).
Dataset Splits | No | The paper mentions 'training set' and 'testing set' but does not specify exact percentages, sample counts, or a detailed methodology for splitting the data into training, validation, and test sets.
Hardware Specification | Yes | We run all the methods 10 times on a PC with four 1080Ti GPUs.
Software Dependencies | No | The paper mentions software like PyTorch and JAX but does not specify version numbers for these or any other libraries or dependencies.
Experiment Setup | Yes | Detailed settings are given in our Appendix. For our method, we search the step size from the set {1, 10^-1, 10^-2, 10^-3, 10^-4, 10^-5}. Following the default setting in (Ji et al., 2021), we set Q = 3 and η = 0.5 for our method. In addition, we set η_k = 1/(100 + k), c1 = 10 and c2 = 10 for our method. For V-PBGD, RMD-PCD, and Approx, following the setting in (Pedregosa, 2016), we set the inner iteration number at 100. We set δ = 1e-6.
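The reported hyperparameters can be collected into a configuration sketch. Since the paper's code is not public, the names below (`DMLCBO_CONFIG`, `eta_k`) are illustrative, not taken from any released implementation; only the values come from the setup quoted above.

```python
# Hyperparameters reported for DMLCBO (illustrative structure; values from the paper's setup).
DMLCBO_CONFIG = {
    # Step size searched over the logarithmic grid {1, 10^-1, ..., 10^-5}.
    "step_size_grid": [1, 1e-1, 1e-2, 1e-3, 1e-4, 1e-5],
    "Q": 3,        # inner-iteration count, default from (Ji et al., 2021)
    "eta": 0.5,    # η, default from (Ji et al., 2021)
    "c1": 10,
    "c2": 10,
    "delta": 1e-6, # δ
    "inner_iters_baselines": 100,  # for V-PBGD, RMD-PCD, and Approx
}

def eta_k(k: int) -> float:
    """Decaying step size η_k = 1 / (100 + k), as stated in the setup."""
    return 1.0 / (100 + k)
```

A grid search over `step_size_grid` with the remaining values fixed would reproduce the tuning protocol described, assuming the appendix does not override these defaults.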