Parameterizing Branch-and-Bound Search Trees to Learn Branching Policies

Authors: Giulia Zarpellon, Jason Jo, Andrea Lodi, Yoshua Bengio (pp. 3931-3939)

AAAI 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Experiments on MILP benchmark instances clearly show the advantages of incorporating an explicit parameterization of the state of the B&B search tree to modulate the branching decisions, in terms of both higher accuracy and smaller B&B trees.
Researcher Affiliation | Academia | 1 Polytechnique Montréal, 2 Mila, 3 Université de Montréal
Pseudocode | No | The paper describes the algorithms and models used (e.g., B&B, DNN architectures), but it does not include any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Code, data and supplementary materials can be found at: https://github.com/ds4dm/branch-search-trees
Open Datasets | Yes | Thus we select 27 heterogeneous problems from real-world MILP benchmark libraries (Bixby et al. 1998; Koch et al. 2011; Gleixner et al. 2019; Mittelmann 2020), focusing on instances whose tree exploration is on average relatively contained (in the tens/hundreds of thousands of nodes, max.) and whose optimal value is known.
Dataset Splits | Yes | The final composition of train, validation and test sets is summarized in Table 1(b).
Hardware Specification | No | Further details on the solver parameters and hardware settings are reported in the SM. The main text does not specify exact hardware components such as CPU/GPU models or memory.
Software Dependencies | Yes | We use SCIP 6.0.1.
Experiment Setup | Yes | Our hyper-parameter search spans: learning rate LR ∈ {0.01, 0.001, 0.0001}, hidden size h ∈ {32, 64, 128, 256}, and depth d ∈ {2, 3, 5}. The factor by which units of NoTree are reduced is 2, and we fix INF = 8. We use PyTorch (Paszke et al. 2019) to train the models for 40 epochs, reducing LR by a factor of 10 at epochs 20 and 30. We train both IL policies using ADAM (Kingma and Ba 2015) with default β1 = 0.9, β2 = 0.999, and weight decay 1 × 10⁻⁵.
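
The training setup quoted above maps onto standard PyTorch components. The following is a minimal sketch, assuming a placeholder MLP policy and an arbitrary input dimension; only the hyper-parameter grid, ADAM settings, epoch count, and LR schedule come from the quoted text, everything else is illustrative and not the authors' implementation.

import itertools
import torch

# Hypothetical sketch of the reported training setup; the policy network,
# input dimension, and data loading are placeholders, not the paper's code.
LEARNING_RATES = [0.01, 0.001, 0.0001]  # LR grid from the quoted setup
HIDDEN_SIZES = [32, 64, 128, 256]       # hidden size h
DEPTHS = [2, 3, 5]                      # depth d

def make_policy(hidden_size, depth, in_dim=64):
    # Placeholder MLP standing in for a branching policy (illustrative only).
    layers = []
    for _ in range(depth):
        layers += [torch.nn.Linear(in_dim, hidden_size), torch.nn.ReLU()]
        in_dim = hidden_size
    layers.append(torch.nn.Linear(in_dim, 1))
    return torch.nn.Sequential(*layers)

for lr, h, d in itertools.product(LEARNING_RATES, HIDDEN_SIZES, DEPTHS):
    model = make_policy(h, d)
    # ADAM with the stated betas and a weight decay of 1e-5.
    optimizer = torch.optim.Adam(
        model.parameters(), lr=lr, betas=(0.9, 0.999), weight_decay=1e-5)
    # 40 epochs, with LR reduced by a factor of 10 at epochs 20 and 30.
    scheduler = torch.optim.lr_scheduler.MultiStepLR(
        optimizer, milestones=[20, 30], gamma=0.1)
    for epoch in range(40):
        # ... one imitation-learning training epoch would run here (omitted) ...
        optimizer.step()  # no-op placeholder so the LR schedule advances cleanly
        scheduler.step()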