Unsupervised Learning for Combinatorial Optimization Needs Meta Learning
Authors: Haoyu Peter Wang, Pan Li
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the benefits of Meta-EGN via experiments on three benchmark CO problems (max clique, minimum vertex cover, and max independent set) over multiple synthetic graphs and three real-world graph datasets, with the number of nodes ranging from 100 to 5000. |
| Researcher Affiliation | Academia | Haoyu Wang1, Pan Li1,2 1. Department of Electrical and Computer Engineering, Georgia Institute of Technology 2. Department of Computer Science, Purdue University |
| Pseudocode | Yes | Algorithm 1 Train Meta-EGN and Test Meta-EGN with/without Fine-tuning |
| Open Source Code | Yes | Our code is available at: https://github.com/Graph-COM/Meta_CO |
| Open Datasets | Yes | We conduct experiments on the MC and MVC problems over three real datasets Twitter (Leskovec & Krevl, 2014), COLLAB and IMDB (Yanardag & Vishwanathan, 2015) and two synthetic datasets with 200 and 500 nodes generated by the RB model (Xu, 2007). |
| Dataset Splits | Yes | For the real datasets, training/validation/test instances are randomly divided with the ratio of 7:1:2; For RB200 and RB500, 2000/100/100 graphs are generated for training/validation/test instances; ... We generate 15 graphs for validation and 20 graphs for each node degree configuration in D for test. |
| Hardware Specification | Yes | We run all experiments by using a Xeon(R) Gold 6248 CPU with 26 threads and a Quadro RTX 6000 GPU. |
| Software Dependencies | Yes | All the codes run on the PyTorch platform 1.9.0 (Paszke et al., 2019) and PyTorch Geometric framework 1.7.2 (Fey & Lenssen, 2019). |
| Experiment Setup | Yes | For the MC and MVC problems, we use 4-layer GIN (Xu et al., 2019) as the backbone network for both Meta-EGN and EGN. We use 1e-3 as both the outer learning rate (γ) of Meta-EGN and the learning rate of EGN. ... For the MIS problem, we use 6-layer GIN. The outer learning rate (γ) of Meta-EGN and the learning rate of EGN are set as 1e-4. The inner learning rate (α) of Meta-EGN is always set as 5e-5. |
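To make the reported hyperparameters concrete, the following is a minimal sketch of the MAML-style inner/outer update that Algorithm 1 describes, using the paper's reported inner rate (α = 5e-5) and the MC/MVC outer rate (γ = 1e-3). Everything else here is a toy stand-in: a single scalar parameter and a quadratic per-instance loss replace the 4-layer GIN and the relaxed CO objective, and the first-order approximation of the meta-gradient is an assumption made for brevity.

```python
# Hypothetical first-order meta-learning sketch; not the authors' code.
ALPHA = 5e-5  # inner learning rate (alpha), as reported in the paper
GAMMA = 1e-3  # outer learning rate (gamma) for the MC/MVC setting

def grad(theta, target):
    """Gradient of a toy quadratic loss (theta - target)^2,
    standing in for the gradient of the relaxed CO objective."""
    return 2.0 * (theta - target)

def meta_step(theta, batch):
    """One outer update over a batch of problem instances."""
    outer_grad = 0.0
    for target in batch:
        # Inner step: instance-specific fine-tuning starting from theta.
        theta_i = theta - ALPHA * grad(theta, target)
        # First-order approximation: use the post-adaptation gradient
        # at theta_i for the outer (meta) update.
        outer_grad += grad(theta_i, target)
    return theta - GAMMA * outer_grad / len(batch)

theta = 0.0
for _ in range(1000):
    theta = meta_step(theta, [1.0, 2.0, 3.0])
```

At test time, the same inner step doubles as the optional fine-tuning pass: one or a few gradient steps from the meta-trained parameters on the test instance itself.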