Constrained Bi-Level Optimization: Proximal Lagrangian Value Function Approach and Hessian-free Algorithm
Authors: Wei Yao, Chengming Yu, Shangzhi Zeng, Jin Zhang
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate the efficiency of LV-HBA through numerical experiments on synthetic problems, hyperparameter optimization for SVM and federated bilevel learning. Empirical results validate the superior practical performance of LV-HBA. |
| Researcher Affiliation | Collaboration | Wei Yao (1,2,4), Chengming Yu (1), Shangzhi Zeng (3,1), Jin Zhang (2,1)*. 1: National Center for Applied Mathematics Shenzhen, SUSTech; 2: Department of Mathematics, SUSTech; 3: Department of Mathematics and Statistics, UVic; 4: CETC Key Laboratory of Smart City Modeling Simulation and Intelligent Technology, The Smart City Research Institute of CETC |
| Pseudocode | Yes | Algorithm 1: proximal Lagrangian Value function-based Hessian-free Bi-level Algorithm (LV-HBA) |
| Open Source Code | Yes | The code is available at https://github.com/SUSTech-Optimization/LV-HBA. |
| Open Datasets | Yes | We conduct experiments on the dataset diabetes from Dua et al. (2017), the dataset fourclass from Ho & Kleinberg (1996), and the dataset gisette (Guyon et al., 2004). |
| Dataset Splits | Yes | For dataset diabetes, we randomly partition it into training, validation, and testing subsets containing 500, 150, and 118 examples, respectively. Similarly, dataset fourclass is partitioned into training, validation, and testing subsets with 500, 150, and 212 examples, respectively. For dataset gisette, we segment it into training, validation, and testing subsets, comprising 400, 180, and 5420 examples, respectively. |
| Hardware Specification | Yes | All experiments were conducted using Python 3.8 on a computer with an Intel(R) Xeon(R) Gold 5218R CPU @ 2.10GHz and an NVIDIA A100 GPU with 40GB memory. |
| Software Dependencies | Yes | All experiments were conducted using Python 3.8... The hyperparameter optimization of SVM and the data hyper-cleaning experiments were performed using qpth version 0.0.11 and cvxpy version 1.2.0. The experiments were executed with opencv-python version 4.6.0.66. |
| Experiment Setup | Yes | Detailed experimental settings and parameter configurations can be found in Appendix A.1. In Figure 1, the step sizes are chosen as α = 0.005, β = 0.002, η = 0.03, γ1 = γ2 = 10, r = 1 with parameter c_k = (k + 1)^0.3. For LV-HBA, the step sizes are chosen as α = 0.02, β = 0.001, η = 0.1, γ1 = γ2 = 1, r = 1000 with parameter c_k = (k + 1)^0.3. |
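
The step sizes and growing penalty parameter quoted in the Experiment Setup row can be collected into a small configuration object. The sketch below is illustrative only: the class name, field layout, and helper function are assumptions, and only the numerical values and the schedule c_k = (k + 1)^0.3 are taken from the row above.

```python
# Illustrative configuration for the LV-HBA settings quoted in the table.
# Only the numerical values and the schedule c_k = (k + 1)^0.3 come from the
# paper text; the class name and field layout are assumptions for exposition.
from dataclasses import dataclass


@dataclass
class LVHBAConfig:
    alpha: float = 0.02    # step size alpha
    beta: float = 0.001    # step size beta
    eta: float = 0.1       # step size eta
    gamma1: float = 1.0    # parameter gamma_1
    gamma2: float = 1.0    # parameter gamma_2
    r: float = 1000.0      # parameter r


def penalty_c(k: int, exponent: float = 0.3) -> float:
    """Growing penalty parameter c_k = (k + 1)^exponent across iterations k."""
    return (k + 1) ** exponent
```

For the Figure 1 setting, the same object would simply be instantiated with `alpha=0.005, beta=0.002, eta=0.03, gamma1=10, gamma2=10, r=1`.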
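
The Dataset Splits row describes random train/validation/test partitions (e.g., 500/150/118 for diabetes, 500/150/212 for fourclass). The snippet below is a minimal sketch of such a partition, assuming the data are already loaded as arrays and using scikit-learn; the seed, helper name, and loading step are not taken from the paper's released code.

```python
# Minimal sketch of the random train/validation/test partition described in the
# "Dataset Splits" row. The seed and helper name are assumptions; the released
# LV-HBA code may perform the split differently.
from sklearn.model_selection import train_test_split


def split_three_way(X, y, n_train, n_val, seed=0):
    """Randomly split (X, y) into train/val/test subsets of the requested sizes."""
    X_tr, X_rest, y_tr, y_rest = train_test_split(
        X, y, train_size=n_train, random_state=seed)
    X_val, X_te, y_val, y_te = train_test_split(
        X_rest, y_rest, train_size=n_val, random_state=seed)
    return (X_tr, y_tr), (X_val, y_val), (X_te, y_te)


# e.g. diabetes (768 samples): 500 train / 150 validation / 118 test
# train, val, test = split_three_way(X, y, n_train=500, n_val=150)
```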