Conformal Inverse Optimization

Authors: Bo Lin, Erick Delage, Timothy Chan

NeurIPS 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through experiments, we demonstrate strong empirical performance of conformal IO compared to classic IO and provide insights into modeling choices.
Researcher Affiliation | Academia | Bo Lin (University of Toronto, blin@mie.utoronto.ca); Erick Delage (HEC Montréal and Mila Québec AI Institute, erick.delage@hec.ca); Timothy C. Y. Chan (University of Toronto and Vector Institute, tcychan@mie.utoronto.ca)
Pseudocode | No | The paper describes its methods through text and mathematical formulations but does not include any explicit pseudocode or algorithm blocks.
Open Source Code | Yes | Data and source code available at https://anonymous.4open.science/r/ConformalIO-B776.
Open Datasets | No | For both problems, we randomly generate ground-truth parameters θ and a dataset of N = 1000 DMs. The paper does not provide concrete access information (link, DOI, or formal citation) for a publicly available or open dataset; instead, it describes generating synthetic data.
Dataset Splits | Yes | Unless otherwise noted, experiments are based on a 60/20/20 train-validation-test split and are repeated 10 times with different random seeds.
Hardware Specification | Yes | All the algorithms are implemented and tested using Python 3.9.1 on a MacBook Pro with an Apple M1 Pro processor and 16 GB of RAM.
Software Dependencies | Yes | All the algorithms are implemented and tested using Python 3.9.1 on a MacBook Pro with an Apple M1 Pro processor and 16 GB of RAM. Optimization models are implemented with Gurobi 9.5.2.
Experiment Setup | Yes | Unless otherwise noted, experiments are based on a 60/20/20 train-validation-test split and are repeated 10 times with different random seeds. [...] Hyper-parameters are tuned using a separate validation set of 200 decision data points. Batch size is set to 64. We use the Adam optimizer with an initial learning rate of 0.1. We train the model for 20 epochs.
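
The split protocol quoted in the Dataset Splits and Experiment Setup rows is concrete enough to sketch. Below is a minimal illustration, not the authors' released code, of a 60/20/20 train-validation-test split of the N = 1000 decision data points repeated over 10 random seeds; the function name and the seed values 0 through 9 are assumptions made for the example.

```python
# Hypothetical sketch of the reported evaluation protocol:
# 60/20/20 split of N = 1000 decision data points, repeated with 10 seeds.
import numpy as np

N = 1000  # number of synthetic decision-maker observations reported in the paper

def split_indices(n, seed, frac_train=0.6, frac_val=0.2):
    """Shuffle indices 0..n-1 and cut them into train/validation/test parts."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n)
    n_train = int(frac_train * n)
    n_val = int(frac_val * n)
    return perm[:n_train], perm[n_train:n_train + n_val], perm[n_train + n_val:]

for seed in range(10):  # 10 repetitions with different random seeds
    train_idx, val_idx, test_idx = split_indices(N, seed)
    # ... fit conformal IO on train_idx, tune on val_idx, evaluate on test_idx ...
```

With these fractions the validation fold contains 200 points, matching the "separate validation set of 200 decision data points" mentioned in the Experiment Setup row.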
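The remaining hyper-parameters in the Experiment Setup row (batch size 64, Adam with an initial learning rate of 0.1, 20 epochs) could be wired together as in the sketch below. This assumes a PyTorch-style training loop; the paper does not name the learning library it uses, and the model, data tensors, and MSE loss here are placeholders rather than the authors' actual objective.

```python
# Hedged sketch of the reported training configuration (PyTorch assumed).
import torch
from torch.utils.data import DataLoader, TensorDataset

def train(model, features, targets, epochs=20, batch_size=64, lr=0.1):
    """Train with the batch size, optimizer, and epoch count quoted in the table."""
    loader = DataLoader(TensorDataset(features, targets),
                        batch_size=batch_size, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)  # initial learning rate 0.1
    loss_fn = torch.nn.MSELoss()  # placeholder loss; the paper's loss is problem-specific
    for _ in range(epochs):  # 20 epochs
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            optimizer.step()
    return model
```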