LIMIP: Lifelong Learning to Solve Mixed Integer Programs
Authors: Sahil Manchanda, Sayan Ranu
AAAI 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We evaluate LIMIP on a series of NP-hard problems and establish that in comparison to existing baselines, LIMIP is up to 50% better when confronted with lifelong learning. |
| Researcher Affiliation | Academia | Sahil Manchanda and Sayan Ranu, Department of Computer Science and Engineering, Indian Institute of Technology, Delhi, {sahil.manchanda,sayanranu}@cse.iitd.ac.in |
| Pseudocode | Yes | The detailed steps of training a sequence of tasks in lifelong fashion through LIMIP are described in Algorithm 1 in Appendix A.4. |
| Open Source Code | Yes | The codebase can be found at https://github.com/ideaiitd/LiMIP. |
| Open Datasets | No | The paper describes generating its own datasets based on known problems like Set Cover and Independent Set, but does not provide concrete access information (link, DOI, or a specific citation with authors/year) for publicly available versions of the exact datasets used for training. For example, it cites Balas and Ho (1980) for Set Cover, which defines the problem rather than a specific dataset instance used for training. A hedged sketch of this style of instance generation follows the table. |
| Dataset Splits | Yes | For each task, we generate 150,000 branching samples extracted using 10,000 generated instances for training and 30,000 validation/test samples generated using 2,000 instances. |
| Hardware Specification | Yes | We use a system running on Intel Xeon 6248 processor with 96 cores and 1 NVIDIA A100 GPU with 40GB memory for our experiments. |
| Software Dependencies | Yes | We use SCIP (Gamrath et al. 2020) as the backend solver... Gamrath, G.; Anderson, D.; Bestuzheva, K.; Chen, W.-K.; Eifler, L.; Gasse, M.; Gemander, P.; Gleixner, A.; Gottwald, L.; and Halbig, K. 2020. The SCIP Optimization Suite 7.0. ZIB-Report. |
| Experiment Setup | Yes | 4.2 Experimental Setup and Parameters ... We use an attention mechanism with 2 heads. We set the default buffer size to 500. For details of all parameters and system settings, we refer to App A.6. A summary sketch of these settings follows the table. |
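
The Set Cover and Independent Set benchmarks referenced in the Open Datasets row are standard combinatorial problems that can be instantiated as MIPs with any random generator. The sketch below is not the authors' generator; it is a minimal illustration of building a random Set Cover instance with SCIP through PySCIPOpt, and the sizes, density, and cost range are hypothetical illustration values rather than the paper's settings.

```python
"""Minimal sketch (assumed, not the LIMIP generator) of a random Set Cover
MIP built with SCIP via PySCIPOpt. Instance sizes, density, and costs are
illustrative placeholders."""
import random
from pyscipopt import Model, quicksum

def random_set_cover(n_rows=50, n_cols=100, density=0.1, seed=0):
    rng = random.Random(seed)
    model = Model("set_cover")
    # One binary variable per candidate set, with a random cost as its
    # objective coefficient.
    x = [model.addVar(vtype="B", obj=rng.randint(1, 100), name=f"x{j}")
         for j in range(n_cols)]
    for i in range(n_rows):
        # Each element is covered by a random subset of columns; ensure at
        # least one covering set so the instance stays feasible.
        cover = [j for j in range(n_cols) if rng.random() < density]
        if not cover:
            cover = [rng.randrange(n_cols)]
        model.addCons(quicksum(x[j] for j in cover) >= 1)
    model.setMinimize()
    return model

if __name__ == "__main__":
    m = random_set_cover()
    m.optimize()
    print("objective:", m.getObjVal())
```

Solving such instances with SCIP as the backend exposes the branching decisions from which imitation-learning samples of the kind described in the Dataset Splits row are typically extracted.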
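
For quick reference, the data budget and model settings quoted in the Dataset Splits and Experiment Setup rows can be collected in one place. The dictionary below is only a summary sketch with illustrative key names; it is not the configuration schema used in the LIMIP repository.

```python
# Summary sketch of the settings stated in the paper; key names are
# assumptions for illustration, not the repository's actual config keys.
LIMIP_REPORTED_SETTINGS = {
    "train_instances": 10_000,            # instances used to extract training samples
    "train_branching_samples": 150_000,   # branching samples per task for training
    "val_test_instances": 2_000,          # instances for validation/test samples
    "val_test_branching_samples": 30_000, # validation/test branching samples
    "attention_heads": 2,                 # attention mechanism with 2 heads
    "buffer_size": 500,                   # default buffer size
}
```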