A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization

Authors: Sebastian Sanokowski, Sepp Hochreiter, Sebastian Lehner

ICML 2024

Reproducibility Variable | Result | LLM Response
--- | --- | ---
Research Type | Experimental | "We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems."
Researcher Affiliation | Collaboration | 1 Institute for Machine Learning, Johannes Kepler University, Linz, Austria; 2 ELLIS Unit Linz; 3 NXAI GmbH.
Pseudocode | No | The paper describes the methods in prose and mathematical equations but does not include any explicitly labeled 'Pseudocode' or 'Algorithm' blocks.
Open Source Code | Yes | "Code is available at https://github.com/ml-jku/DiffUCO."
Open Datasets | Yes | "For the experiments on MIS and MaxCl, graphs are generated by the so-called RB-Model (Xu et al., 2005)... On MDS and MaxCut randomly generated Barabási-Albert (BA) graphs (Barabási and Albert, 1999) are used. We furthermore evaluate our method on the Gset MaxCut dataset... (Ye, 2003)."
Dataset Splits | Yes | "For both of these graph types, 4000 graphs are used for training and 500 graphs are used for the validation and testing, respectively."
Hardware Specification | Yes | "On RB-large, MIS experiments are for example conducted on one A100 GPU with 80 GB. All time measurements were conducted on an A100 NVIDIA GPU. Self-conducted Gurobi evaluations on MaxCut are run on an Intel Xeon Platinum 8168 @ 2.70GHz CPU with 24 cores."
Software Dependencies | No | The paper mentions "The code for this research project is based on jax (Bradbury et al., 2018)", but it does not provide specific version numbers for JAX or any other software libraries or dependencies used in the experiments.
Experiment Setup | Yes | "A table with all hyperparameters on each dataset is given in Tab. 7."