Generalization Bounds for (Wasserstein) Robust Optimization
Authors: Yang An, Rui Gao
NeurIPS 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | In this paper, we derive generalization bounds for robust optimization and Wasserstein robust optimization for Lipschitz and piecewise Hölder smooth loss functions under both stochastic and adversarial settings, assuming that the underlying data distribution satisfies transportation-information inequalities. The proofs are built on new generalization bounds for variation regularization (such as Lipschitz or gradient regularization) and its connection with robustness. |
| Researcher Affiliation | Academia | Yang An, Department of Mathematics, Columbia University, yangan@math.columbia.edu; Rui Gao, Department of IROM, University of Texas at Austin, rui.gao@mccombs.utexas.edu |
| Pseudocode | No | The paper is theoretical and does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any statement about releasing open-source code for the described methodology. |
| Open Datasets | No | The paper describes theoretical concepts and does not use or refer to specific publicly available datasets for empirical evaluation. The examples provided describe theoretical data distributions. |
| Dataset Splits | No | The paper is theoretical and does not discuss dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not mention any specific hardware used for running experiments. |
| Software Dependencies | No | The paper is theoretical and does not mention any specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details or hyperparameters. |
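Although the paper reports no experiments, the connection it builds on, that worst-case (robust) loss over a Wasserstein ball equals the empirical loss plus a variation-regularization term, can be checked numerically in a toy case. The sketch below is ours, not the authors': it uses the ∞-Wasserstein ball (each sample may move by at most ε), the 1-Lipschitz absolute-deviation loss, and illustrative function names; in this setting the worst-case loss equals the empirical loss plus ε times the Lipschitz constant.

```python
import numpy as np

def empirical_loss(theta, z):
    """Mean absolute-deviation loss; 1-Lipschitz in each sample z_i."""
    return np.abs(z - theta).mean()

def robust_loss_lipschitz_reg(theta, z, eps, lip=1.0):
    """Regularization form: empirical loss + radius * Lipschitz constant."""
    return empirical_loss(theta, z) + eps * lip

def robust_loss_adversarial(theta, z, eps):
    """Worst case over the infinity-Wasserstein ball of radius eps:
    move each sample by eps directly away from theta, which increases
    |z_i - theta| by exactly eps for every sample."""
    z_adv = z + eps * np.where(z >= theta, 1.0, -1.0)
    return empirical_loss(theta, z_adv)

# Toy data (illustrative, not from the paper).
rng = np.random.default_rng(0)
z = rng.normal(size=1000)
theta, eps = 0.0, 0.1

adv = robust_loss_adversarial(theta, z, eps)
reg = robust_loss_lipschitz_reg(theta, z, eps)
```

For this loss the two quantities coincide exactly, which is the simplest instance of the robustness-equals-regularization identity the paper's generalization bounds exploit; for general Wasserstein balls and smoother losses the identity holds only approximately, which is where the paper's analysis does its work.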