Self-Supervised Primal-Dual Learning for Constrained Optimization
Authors: Seonho Park, Pascal Van Hentenryck
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments show that, on a set of nonlinear optimization benchmarks, PDL typically exhibits negligible constraint violations and minor optimality gaps, and is remarkably close to the ALM optimization. |
| Researcher Affiliation | Academia | H. Milton Stewart School of Industrial & Systems Engineering, Georgia Institute of Technology seonho.park@gatech.edu, pascal.vanhentenryck@isye.gatech.edu |
| Pseudocode | Yes | Algorithm 1 Primal-Dual Learning (PDL). (A hedged training-loop sketch follows the table.) |
| Open Source Code | No | The paper does not provide a direct link or explicit statement for the open-source code of the described methodology (PDL). It only provides a link for a baseline method: 'https://github.com/locuslab/DC3' |
| Open Datasets | Yes | Table 3 reports the performance results of the case57 and case118 in PGLib (Babaeinejadsarookolaee et al. 2019). |
| Dataset Splits | Yes | Overall, 10,000 instances were generated and split into training/testing/validation datasets with the ratio of 10:1:1. (See the data-split sketch below the table.) |
| Hardware Specification | Yes | The implementation is based on PyTorch and the training was conducted using a Tesla RTX6000 GPU on a machine with Intel Xeon 2.7GHz. |
| Software Dependencies | Yes | Gurobi 9.5.2 was used as the optimization solver to produce the instance data for the supervised baselines, and also served as the reference for computing optimality gaps. |
| Experiment Setup | Yes | For training the models, the Adam optimizer (Kingma and Ba 2014) with a learning rate of 1e-4 was used. Other hyperparameters of PDL and the baselines were tuned using a grid search. (See the setup sketch below the table.) |
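Since the paper presents PDL only as pseudocode (Algorithm 1) and no official code link is given, the following is a minimal PyTorch sketch of the method's outer/inner loop structure: a primal network is trained on the augmented Lagrangian with the dual network frozen, the dual network is then regressed toward the dual-ascent target, and the penalty coefficient grows when the constraint violation stalls. Everything here is an assumption-laden toy: the objective `f`, constraint `g`, network sizes, and hyperparameter names (`rho`, `alpha`, `tau`) are illustrative placeholders, the sketch covers inequality constraints only, and it is not the authors' implementation.

```python
import torch
import torch.nn as nn

# Toy parametric problem: minimize f(x; d) s.t. g(x; d) <= 0, where d is
# the instance vector. Both functions are placeholders, not from the paper.
def f(x, d):
    return ((x - d) ** 2).sum(dim=1)          # objective per instance

def g(x, d):
    return x.sum(dim=1, keepdim=True) - 1.0   # one inequality constraint

class Net(nn.Module):
    """Small MLP used for both the primal and the dual network."""
    def __init__(self, in_dim, out_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))

    def forward(self, d):
        return self.body(d)

def train_pdl(instances, n_outer=10, n_inner=200,
              rho=1.0, alpha=2.0, tau=0.9, lr=1e-4):
    """Outer/inner loop structure in the spirit of Algorithm 1 (PDL)."""
    dim = instances.shape[1]
    primal, dual = Net(dim, dim), Net(dim, 1)
    opt_p = torch.optim.Adam(primal.parameters(), lr=lr)
    opt_d = torch.optim.Adam(dual.parameters(), lr=lr)
    prev_viol = float("inf")
    for _ in range(n_outer):
        # (1) Primal step: minimize the augmented Lagrangian with the
        #     dual network's multiplier estimates held fixed.
        for _ in range(n_inner):
            x = primal(instances)
            lam = dual(instances).detach().clamp(min=0.0)
            viol = torch.relu(g(x, instances))
            loss = (f(x, instances)
                    + (lam * viol).sum(dim=1)
                    + 0.5 * rho * (viol ** 2).sum(dim=1)).mean()
            opt_p.zero_grad()
            loss.backward()
            opt_p.step()
        # (2) Dual step: regress the dual network toward the dual-ascent
        #     target lambda + rho * violation, with the primal net frozen.
        with torch.no_grad():
            x = primal(instances)
            target = (dual(instances).clamp(min=0.0)
                      + rho * torch.relu(g(x, instances)))
        for _ in range(n_inner):
            d_loss = ((dual(instances) - target) ** 2).mean()
            opt_d.zero_grad()
            d_loss.backward()
            opt_d.step()
        # (3) Grow the penalty when the maximum violation stalls.
        with torch.no_grad():
            cur_viol = torch.relu(g(primal(instances), instances)).max().item()
        if cur_viol > tau * prev_viol:
            rho *= alpha
        prev_viol = cur_viol
    return primal, dual
```

The `detach`/`no_grad` boundaries are the essential design choice: each network sees the other only as a fixed target, mirroring the alternating primal and dual updates of the augmented Lagrangian method.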
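The reported 10:1:1 split over 10,000 generated instances can be reproduced mechanically; a minimal sketch follows, assuming only that instances are indexable (the generation procedure itself is problem-specific and not shown in this report).

```python
import torch

n = 10_000
perm = torch.randperm(n)            # shuffle before splitting
n_val = n // 12                     # 10:1:1 -> roughly 8,334 / 833 / 833
n_test = n // 12
val_idx = perm[:n_val]
test_idx = perm[n_val:n_val + n_test]
train_idx = perm[n_val + n_test:]   # remaining ~10/12 of the instances
```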
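The reported optimizer configuration is a one-liner in PyTorch; the sketch below also hints at the grid search mentioned for the remaining hyperparameters. The grid values and the `nn.Linear` stand-in model are hypothetical, since the paper states only the Adam learning rate.

```python
import itertools
import torch
import torch.nn as nn

model = nn.Linear(10, 10)  # stand-in for the PDL networks
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # as reported

# Hypothetical grid for the remaining hyperparameters; the paper says only
# that a grid search was used, not which values were searched.
hidden_sizes = [64, 128, 256]
batch_sizes = [32, 64]
for hidden, batch in itertools.product(hidden_sizes, batch_sizes):
    pass  # train and validate one configuration per grid point
```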