Sym-NCO: Leveraging Symmetricity for Neural Combinatorial Optimization

Authors: Minsu Kim, Junyoung Park, Jinkyoo Park

NeurIPS 2022

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | This section provides the experimental results of Sym-NCO for TSP, CVRP, PCTSP, and OP.
Researcher Affiliation | Academia | Minsu Kim, Junyoung Park, Jinkyoo Park; Korea Advanced Institute of Science and Technology (KAIST), Dept. Industrial & Systems Engineering; {min-su, Junyoungpark, jinkyoo.park}@kaist.ac.kr
Pseudocode | No | The paper does not contain any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Our source code is available at https://github.com/alstn12088/Sym-NCO.
Open Datasets | Yes | Dataset and Computing Resources. We use the benchmark dataset [21] to evaluate the performance of the solvers.
Dataset Splits | No | The paper mentions using a 'benchmark dataset [21]' and shows 'Validation Cost' in figures, implying the use of validation data. However, it does not explicitly state the dataset splits (e.g., percentages or sample counts) within the paper's text.
Hardware Specification | Yes | To train the neural solvers, we use Nvidia A100 GPU. To evaluate the inference speed, we use an Intel Xeon E5-2630 CPU and Nvidia RTX2080Ti GPU to make fair comparisons with the existing methods as proposed in [26].
Software Dependencies | No | The paper mentions the use of 'neural solvers' but does not specify any software dependencies (e.g., programming languages, libraries, or frameworks) with version numbers.
Experiment Setup | Yes | Hyperparameters. We apply Sym-NCO to POMO, AM, and Pointer Net. To make fair comparisons, we use the same network architectures and training-related hyperparameters from their original papers to train their Sym-NCO-augmented models. Please refer to Appendix C.1 for more details.
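
For context on the "symmetricity" named in the title: one symmetry of Euclidean routing problems such as TSP is rotational invariance of the instance coordinates, which leaves every tour's length (and hence the optimal solution) unchanged. The sketch below is a minimal illustration of that property in plain NumPy; it is not code from the paper or its repository, and the `tour_length` helper is an illustrative assumption, not an API of Sym-NCO.

```python
# Illustrative sketch (plain NumPy, not from the Sym-NCO repository):
# rotating all node coordinates of a Euclidean TSP instance changes no tour length.
import numpy as np

def tour_length(coords: np.ndarray, tour: np.ndarray) -> float:
    """Total length of a closed tour visiting nodes in the given order."""
    ordered = coords[tour]
    return float(np.linalg.norm(ordered - np.roll(ordered, -1, axis=0), axis=1).sum())

rng = np.random.default_rng(0)
coords = rng.random((20, 2))           # random 20-node instance in the unit square
tour = rng.permutation(20)             # an arbitrary (not necessarily optimal) tour

theta = rng.uniform(0.0, 2.0 * np.pi)  # random rotation angle
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
rotated = coords @ rotation.T          # rotate every node about the origin

assert np.isclose(tour_length(coords, tour), tour_length(rotated, tour))
print("tour length is invariant under rotation")
```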