Functionally Constrained Algorithm Solves Convex Simple Bilevel Problem

Authors: Huaqing Zhang, Lesi Chen, Jing Xu, Jingzhao Zhang

NeurIPS 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we evaluate our proposed methods on two different bilevel problems with smooth objectives. We compare the performance of FC-BiO^sm with existing methods, including a-IRG [14], Bi-SG [19], CG-BiO [13], AGM-BiO [6], PB-APG [10], and Bisec-BiO [28].
Researcher Affiliation | Collaboration | 1) IIIS, Tsinghua University; 2) Shanghai Qizhi Institute; 3) Shanghai AI Lab
Pseudocode | Yes | Algorithm 1: Functionally Constrained Bilevel Optimizer (FC-BiO) (an illustrative sketch of the bisection idea follows this table)
Open Source Code | No | No explicit statement about the authors' own code being open source is present in the provided text. The text only mentions that implementations of other methods (CG-BiO and a-IRG) are based on code available online.
Open Datasets | Yes | We use the Wikipedia Math Essential dataset [22]... using the rcv1.binary dataset from LIBSVM [8, 16]
Dataset Splits | Yes | We uniformly sample m = 5000 instances as the training dataset (A_tr, b_tr), and m instances as the validation dataset (A_val, b_val). (a sampling sketch follows this table)
Hardware Specification | Yes | All experiments are implemented using MATLAB R2022b on a PC running Windows 11 with a 12th Gen Intel(R) Core(TM) i7-12700H CPU (2.30 GHz) and 16GB RAM.
Software Dependencies | Yes | All experiments are implemented using MATLAB R2022b.
Experiment Setup | Yes | We set ϵ_f = ϵ_g = 10^-6. For our Algorithm 1, we take a slightly different implementation: instead of setting the maximum number of iterations of the inner subroutine to T/N, we preset it to 8000. If the current x_k already satisfies ψ(t, x_k) ≤ ϵ/2, then we terminate the inner subroutine directly. We adopt the warm-start strategy described in Appendix B.2. We set l = 0 since f(x) is nonnegative. For FC-BiO^Lip, we set η = 3 × 10^-4. (a configuration sketch follows this table)
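
The Pseudocode row names Algorithm 1 (FC-BiO) but the excerpt does not reproduce it. As rough orientation only, the following minimal Python sketch shows the bisection-on-target idea suggested by the algorithm's name and by the ψ(t, x_k) ≤ ϵ/2 criterion quoted in the Experiment Setup row; the choice of a plain subgradient inner solver, the exact form of ψ, and all function and parameter names here are our assumptions, not the authors' method.

```python
import numpy as np

def fc_bio_sketch(f, grad_f, g, grad_g, g_star, x0,
                  lo=0.0, hi=1.0, eps=1e-6, inner_iters=8000, step=1e-3):
    """Bisection-on-target sketch; not the authors' exact FC-BiO.

    For each candidate target t for the optimal upper-level value, a plain
    subgradient method approximately minimizes
        psi(t, x) = max(f(x) - t, g(x) - g_star).
    If psi can be pushed below eps / 2 the target is achievable, so the upper
    bisection bound shrinks; otherwise the lower bound rises.
    """
    x = np.asarray(x0, dtype=float)
    while hi - lo > eps:
        t = 0.5 * (lo + hi)
        for _ in range(inner_iters):
            # Subgradient of the max: follow whichever term is currently active.
            d = grad_f(x) if f(x) - t >= g(x) - g_star else grad_g(x)
            x = x - step * d
            if max(f(x) - t, g(x) - g_star) <= eps / 2:
                break  # early termination, as in the Experiment Setup row
        if max(f(x) - t, g(x) - g_star) <= eps / 2:
            hi = t  # target attainable: tighten the upper bound
        else:
            lo = t  # target too ambitious: raise the lower bound
    return x, hi

# Toy usage with illustrative quadratics (for this g, the lower-level optimum
# gives g_star = 0 and the bilevel optimal value is 2).
f = lambda x: float(x @ x)
g = lambda x: float((x - 1.0) @ (x - 1.0))
x_hat, f_bound = fc_bio_sketch(f, lambda x: 2 * x, g, lambda x: 2 * (x - 1.0),
                               g_star=0.0, x0=np.zeros(2), hi=10.0)
```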
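For the Dataset Splits row, the stated protocol (m = 5000 uniformly sampled training instances and m validation instances from rcv1.binary) can be illustrated as below. The authors worked in MATLAB; this Python sketch only mirrors the protocol, and the file path, the fixed seed, the use of scikit-learn's load_svmlight_file, and the disjointness of the two samples are assumptions.

```python
import numpy as np
from sklearn.datasets import load_svmlight_file

# rcv1.binary in LIBSVM format; the local file path is assumed and the file
# must be downloaded from the LIBSVM dataset page beforehand.
A, b = load_svmlight_file("rcv1_train.binary")

m = 5000
rng = np.random.default_rng(0)  # fixed seed is our assumption
idx = rng.choice(A.shape[0], size=2 * m, replace=False)

# m instances for training, m for validation; treating the two samples as
# disjoint is our assumption, the text only says both are sampled uniformly.
A_tr, b_tr = A[idx[:m]], b[idx[:m]]
A_val, b_val = A[idx[m:]], b[idx[m:]]
```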
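The Experiment Setup row lists concrete values (ϵ_f = ϵ_g = 10^-6, a preset inner-iteration cap of 8000, the ψ(t, x_k) ≤ ϵ/2 early stop, warm starts from Appendix B.2, l = 0, and η = 3 × 10^-4 for FC-BiO^Lip). A hedged sketch of how these could be wired together is below; the dictionary keys, the run_inner helper, and the use of ϵ_f in the stopping test are illustrative assumptions, only the numeric values come from the text above.

```python
# Hypothetical settings object mirroring the reported values; names are ours.
SETTINGS = {
    "eps_f": 1e-6,        # target accuracy on the upper-level objective f
    "eps_g": 1e-6,        # target accuracy on the lower-level objective g
    "inner_iters": 8000,  # preset cap on inner-subroutine iterations
    "lower_bound": 0.0,   # bisection lower bound l, valid because f(x) >= 0
    "eta": 3e-4,          # step size reported for the FC-BiO^Lip variant
}

def run_inner(x_warm, t, psi, inner_step, settings=SETTINGS):
    """Warm-started inner subroutine with the early-termination rule above."""
    x = x_warm  # warm start from the previous bisection iterate (Appendix B.2)
    for _ in range(settings["inner_iters"]):
        # Stop as soon as psi(t, x_k) <= eps / 2 instead of exhausting T/N steps.
        if psi(t, x) <= settings["eps_f"] / 2:
            break
        x = inner_step(t, x)  # one step of whichever inner solver is plugged in
    return x
```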