Learning to Search in Branch and Bound Algorithms
Authors: He He, Hal Daumé III, Jason M. Eisner
NeurIPS 2014
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We apply our algorithm to linear programming based branch-and-bound for solving mixed integer programs (MIP). We compare our method with one of the fastest open-source solvers, SCIP; and a very efficient commercial solver, Gurobi. We demonstrate that our approach achieves better solutions faster on four MIP libraries. |
| Researcher Affiliation | Academia | He He, Hal Daumé III, Department of Computer Science, University of Maryland, College Park, MD 20740 {hhe,hal}@cs.umd.edu; Jason Eisner, Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218 jason@cs.jhu.edu |
| Pseudocode | Yes | Algorithm 1 "Policy Learning (S, P)" and Algorithm 2 "Running B&B policies and collecting examples for problem Q" |
| Open Source Code | No | The paper does not provide an explicit statement about releasing its own source code, nor does it provide a direct link to a code repository for its methodology. |
| Open Datasets | Yes | We use four problem libraries suggested in [12]. MIK [13] is a set of MILP problems with knapsack constraints. Regions and Hybrid are sets of problems of determining the winner of a combinatorial auction, generated from different distributions by the Combinatorial Auction Test Suite (CATS) [14]. CORLAT [15] is a real dataset used for the construction of a wildlife corridor for grizzly bears in the Northern Rockies region. |
| Dataset Splits | Yes | Each problem set is split into training, test and development sets. |
| Hardware Specification | No | The paper mentions using SCIP and Gurobi solvers and Cplex to solve LPs, but it does not specify the hardware (e.g., CPU, GPU models, memory) on which these experiments were run. |
| Software Dependencies | Yes | Our solver is implemented based on SCIP (Version 3.1.0) (using Cplex 12.6 as the LP solver), and Gurobi (Version 5.6.2). ... We use LIBLINEAR [16] in the step of training classifiers in Algorithm 1. |
| Experiment Setup | Yes | For each problem set, we split its training set into equal-sized subsets randomly and run DAgger on one subset in each iteration until we have taken two passes over the entire set. ...the example weights at each level of the B&B tree decay exponentially at rate 2.68/D where D is the maximum depth... For pruning policy training, we put a higher weight (tuned from {1, 2, 4, 8}) on the class prune to counter data imbalance... The class weight and SVM's penalty parameter C are tuned for each library on its development set. ... To combine these features with depth of the node, we partition the tree into 10 uniform levels, and features at each level are stacked together. ... We compare with SCIP (Version 3.1.0) (using Cplex 12.6 as the LP solver), and Gurobi (Version 5.6.2). |
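The experiment-setup evidence describes two reproducible ingredients: per-example weights that decay exponentially with tree depth at rate 2.68/D, and a linear classifier trained with a higher weight on the "prune" class to counter imbalance. The sketch below illustrates both under stated assumptions; it uses scikit-learn's `LinearSVC` (which wraps LIBLINEAR) rather than the authors' own pipeline, and the feature matrix, depth values, and class-weight choice are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.svm import LinearSVC

# Assumed maximum depth of the B&B tree (D is problem-dependent in the paper).
D = 20

# Per the quoted setup: example weights decay exponentially with node depth
# at rate 2.68/D. A node at depth d therefore gets weight exp(-2.68 * d / D).
def example_weight(depth, max_depth=D):
    return np.exp(-2.68 * depth / max_depth)

# Stand-in training data: random node features, binary labels
# (1 = prune, 0 = keep), and a random depth per example.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = rng.integers(0, 2, size=200)
depths = rng.integers(0, D, size=200)
sample_weights = example_weight(depths)

# Class-weighted linear SVM: the "prune" class gets a higher weight
# (tuned from {1, 2, 4, 8} in the paper; 4 here is an arbitrary pick),
# and C would be tuned on the development set.
clf = LinearSVC(C=1.0, class_weight={1: 4.0})
clf.fit(X, y, sample_weight=sample_weights)
predictions = clf.predict(X)
```

Deeper nodes thus contribute less to the loss, reflecting that mistakes near the root of the B&B tree are more costly than mistakes deep in the tree.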