Reusing Combinatorial Structure: Faster Iterative Projections over Submodular Base Polytopes

Authors: Jai Moondra, Hassan Mortagy, Swati Gupta

NeurIPS 2021

Reproducibility assessment (variable, result, and LLM response):
Research Type: Experimental. "Our theoretical results show orders of magnitude reduction in runtime in preliminary computational experiments." The code for the computations can be found on GitHub. In the first experiment, the authors iteratively compute the Euclidean projections of 500 randomly generated points onto the permutahedron; each OMD-UAFW run time is normalized to 1000, and run times for all other variants in the same run are scaled accordingly (Figures 4-middle and 4-right).
Researcher Affiliation: Academia. Jai Moondra (jmoondra3@gatech.edu), Hassan Mortagy (hmortagy@gatech.edu), and Swati Gupta (swatig@gatech.edu), all of the Georgia Institute of Technology.
Pseudocode: Yes. Algorithm 1, Adaptive Away-steps Frank-Wolfe (A2FW). Input: submodular f : 2^E → R; (µ, L)-strongly convex and smooth h : B(f) → R; chain of tight sets S (e.g., obtained using INFER); initial point z(0) ∈ B(f) ∩ {x : x(S) = f(S) ∀ S ∈ S} with active set A0; tolerance ε.
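For context, A2FW builds on the classical away-step Frank-Wolfe (AFW) method, and the permutahedron admits a linear minimization oracle that is just a sort. The sketch below is a generic textbook AFW for Euclidean projection onto the permutahedron, with exact line search for the quadratic objective; it is not the authors' implementation and omits the adaptive tight-set machinery (INFER, RESTRICT, ROUNDING) that distinguishes A2FW. Function names are ours.

```python
import numpy as np

def lp_oracle(grad):
    """Linear minimization over the permutahedron conv{permutations of (1..n)}:
    give the smallest values to the largest gradient coordinates (a single sort)."""
    n = len(grad)
    order = np.argsort(-grad)           # coordinates of grad, largest first
    v = np.empty(n)
    v[order] = np.arange(1, n + 1)      # largest-gradient coordinate gets value 1
    return v

def afw_projection(y, iters=1000, tol=1e-9):
    """Euclidean projection of y onto the permutahedron with away-step Frank-Wolfe."""
    z = lp_oracle(-y)                   # start at the vertex maximizing <y, v>
    active = {tuple(z): 1.0}            # active vertex -> convex weight
    for _ in range(iters):
        grad = z - y                    # gradient of 0.5 * ||z - y||^2
        s = lp_oracle(grad)             # Frank-Wolfe vertex
        fw_gap = grad @ (z - s)
        if fw_gap < tol:                # FW gap certifies near-optimality
            break
        a_key = max(active, key=lambda k: grad @ np.array(k))  # away vertex
        a = np.array(a_key)
        is_fw = fw_gap >= grad @ (a - z)
        if is_fw:                       # Frank-Wolfe direction
            d, gamma_max = s - z, 1.0
        else:                           # away direction
            alpha = active[a_key]
            d, gamma_max = z - a, alpha / (1.0 - alpha)
        # exact line search for the quadratic, clipped to the feasible step
        gamma = min(gamma_max, max(0.0, -(grad @ d) / (d @ d)))
        z = z + gamma * d
        if is_fw:                       # shrink all weights, grow s
            for k in active:
                active[k] *= 1.0 - gamma
            active[tuple(s)] = active.get(tuple(s), 0.0) + gamma
        else:                           # grow all weights, shrink a (drop if 0)
            for k in active:
                active[k] *= 1.0 + gamma
            active[a_key] -= gamma
        active = {k: w for k, w in active.items() if w > 1e-12}
    return z
```

The away step is what allows the method to remove stale vertices from the active set, which is exactly the structure the paper's toolkit reuses across successive projections.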
Open Source Code: Yes. "The code for our computations can be found on GitHub."
Open Datasets: No. The paper uses "randomly generated points" and "noisy (linear) loss settings", i.e., synthetic data generated for the experiments. It does not provide a link, DOI, or formal citation to a publicly available dataset.
Dataset Splits: No. The paper describes experiments on randomly generated points and noisy loss settings for online learning, but does not specify any train/validation/test dataset splits.
Hardware Specification: Yes. "All our computations were performed on a single machine with 256GB RAM and Intel Xeon E5-2630 v4 @ 2.20GHz CPU."
Software Dependencies: No. The paper cites the Gurobi optimizer reference manual (version 7.5), but it does not explicitly state that this version was used in the experiments, nor does it list other software dependencies with specific version numbers.
Experiment Setup: Yes. In the first experiment, the authors iteratively compute the Euclidean projections of 500 randomly generated points onto the permutahedron. The cloud of 500 points is generated by fixing a random mean point and perturbing it with multivariate Gaussian noise of mean zero and standard deviation ϵ = 1/50. They consider a time horizon of T = 1000 and construct two noisy (linear) loss settings. For each loss setting, they run Online Frank-Wolfe (OFW) and five variants of Online Mirror Descent (OMD) using the proposed toolkit: (1) OMD-UAFW: OMD with projection via vanilla away-step Frank-Wolfe (baseline); (2) OMD-ASAFW: OMD with AFW with reused active sets; (3) OMD-TSAFW: OMD with AFW with INFER, RESTRICT, and ROUNDING; (4) OMD-A2FW: OMD with A2FW; (5) OMD-PAV: OMD with PAV.
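The OMD-PAV variant exploits a standard fact: Euclidean projection onto the permutahedron reduces, after sorting, to an isotonic-regression problem solvable by the pool-adjacent-violators (PAV) algorithm in near-linear time. The following is a minimal sketch of that standard reduction, not the authors' code; function names are ours.

```python
import numpy as np

def pav_nonincreasing(d):
    """Best nonincreasing least-squares fit to d via pool-adjacent-violators.
    Each block stores [sum, count]; adjacent blocks whose means violate the
    nonincreasing order are merged."""
    blocks = []
    for x in d:
        blocks.append([x, 1])
        # merge while mean(prev block) < mean(last block)
        while (len(blocks) > 1 and
               blocks[-2][0] * blocks[-1][1] < blocks[-1][0] * blocks[-2][1]):
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    out = []
    for total, count in blocks:
        out.extend([total / count] * count)
    return np.array(out)

def project_permutahedron(y):
    """Euclidean projection of y onto conv{permutations of (1..n)} via PAV."""
    n = len(y)
    order = np.argsort(-y)            # sort y in descending order
    ys = y[order]
    c = np.arange(n, 0, -1)           # vertex (n, n-1, ..., 1) in matching order
    s = pav_nonincreasing(ys - c)     # nonincreasing isotonic fit of the residual
    z = np.empty(n)
    z[order] = ys - s                 # subtract the fit, then undo the sort
    return z
```

For example, in n = 2 the permutahedron is the segment between (2, 1) and (1, 2), and projecting (5, 0) onto it returns the endpoint (2, 1). This combinatorial shortcut is why OMD-PAV serves as a fast specialized baseline against the Frank-Wolfe variants.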