Let the Flows Tell: Solving Graph Combinatorial Problems with GFlowNets

Authors: Dinghuai Zhang, Hanjun Dai, Nikolay Malkin, Aaron C. Courville, Yoshua Bengio, Ling Pan

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Through extensive experiments on a variety of different CO tasks with synthetic and realistic data, we demonstrate that GFlowNet policies can efficiently find high-quality solutions."
Researcher Affiliation | Collaboration | Dinghuai Zhang (Mila); Hanjun Dai (Google DeepMind); Nikolay Malkin, Aaron Courville, Yoshua Bengio, Ling Pan (Mila)
Pseudocode | Yes | Algorithm 1: GFlowNet training algorithm
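The paper's Algorithm 1 gives the GFlowNet training procedure. As a rough illustration only — using the widely used trajectory-balance objective (Malkin et al., 2022), which may differ from the objective in the paper's Algorithm 1 — the loss for a single sampled trajectory can be sketched as:

```python
import math

def trajectory_balance_loss(log_Z, log_pf, log_pb, reward):
    """Squared trajectory-balance residual for one trajectory.

    log_Z:  learned log partition function (scalar)
    log_pf: forward log-probabilities of each action along the trajectory
    log_pb: backward log-probabilities of each action along the trajectory
    reward: terminal reward R(x) > 0 of the constructed solution
    """
    # At the optimum, Z * P_F(trajectory) == R(x) * P_B(trajectory),
    # so the log-space residual below is driven to zero.
    residual = log_Z + sum(log_pf) - math.log(reward) - sum(log_pb)
    return residual ** 2

# Toy check: with log_Z = log R and matched forward/backward terms,
# the loss is exactly zero.
loss = trajectory_balance_loss(
    log_Z=math.log(2.0),
    log_pf=[math.log(0.5)],
    log_pb=[math.log(0.5)],
    reward=2.0,
)
```

In practice `log_Z`, `log_pf`, and `log_pb` would be differentiable tensors produced by the policy network, and the squared residual is minimized by gradient descent.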
Open Source Code | Yes | Our implementation is open-sourced at https://github.com/zdhNarsil/GFlowNet-CombOpt.
Open Datasets | Yes | "For MIS problems, we follow the setup in the MIS benchmark from Böther et al. (2022)... we take the more complicated RB graphs (Xu & Li, 2000) following Karalias & Loukas (2020). For realistic data, we take the SATLIB dataset (Hoos et al., 2000)... For the other two tasks... we adopt BA graphs (Barabási & Albert, 1999) following Sun et al. (2022)."
Dataset Splits | Yes | "For all datasets, we use a validation set and a test set, both with 500 data points. ... We choose the hyperparameters of each algorithm based on the validation performance."
Hardware Specification | Yes | "Regarding hardware, we use NVIDIA Tesla V100 Volta GPUs."
Software Dependencies | No | The paper mentions the "pygurobi" and "cvxpy" Python packages but does not provide version numbers for these or for other key software components used in the experiments.
Experiment Setup | Yes | "For the GFlowNet architecture, we use a graph isomorphism network (GIN; Xu et al., 2019) with 5 hidden layers and a 256-dimensional hidden size... We use the Adam optimizer with the default 1e-3 learning rate, without hyperparameter tuning... with a batch size of 64... Training runs for 20 epochs... we use an inverse temperature of 500 for all tasks."
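The reported hyperparameters can be collected into a single configuration. The sketch below is hypothetical — the key names are illustrative and not taken from the authors' code — but the values match the quoted setup:

```python
# Hypothetical configuration mirroring the reported experiment setup.
# Key names are illustrative; values come from the quoted paper text.
config = {
    "backbone": "GIN",           # graph isomorphism network (Xu et al., 2019)
    "num_hidden_layers": 5,
    "hidden_dim": 256,
    "optimizer": "Adam",
    "learning_rate": 1e-3,       # default, used without hyperparameter tuning
    "batch_size": 64,
    "epochs": 20,
    # In CO GFlowNets the inverse temperature beta commonly shapes the reward,
    # e.g. R(x) proportional to exp(beta * objective(x)); exact usage here is
    # an assumption, not confirmed by the quoted text.
    "inverse_temperature": 500,
}
```

Such a dict would typically be passed to the model and optimizer constructors at the start of a training run.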