Distributionally Robust Linear Quadratic Control

Authors: Bahar Taskesen, Dan Iancu, Çağıl Koçyiğit, Daniel Kuhn

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We propose a numerical solution method that efficiently characterizes this optimal control policy. ... Lastly, we implement the algorithm leveraging PyTorch's automatic differentiation module and we find that it yields uniformly lower runtimes than a direct method (based on solving semidefinite programs) across all problem horizons. ... 5. Numerical Experiments: All experiments are run on an Intel i7-8700 CPU (3.2 GHz) machine with 16GB RAM. ... Figure 1a illustrates the execution time for both approaches as a function of the planning horizon T; runs where MOSEK exceeds 100s are not reported. Figure 1b visualizes the empirical convergence behavior of the Frank-Wolfe algorithm."
Researcher Affiliation | Academia | Bahar Taşkesen (EPFL, bahar.taskesen@epfl.ch); Dan A. Iancu (Stanford University, daniancu@stanford.edu); Çağıl Koçyiğit (University of Luxembourg, cagil.kocyigit@uni.lu); Daniel Kuhn (EPFL, daniel.kuhn@epfl.ch)
Pseudocode | Yes | "A detailed description of the proposed Frank-Wolfe method is given in Algorithm 1 below. Algorithm 1: Frank-Wolfe algorithm for solving (12)"
Open Source Code | Yes | "The code is publicly available in the GitHub repository https://github.com/RAO-EPFL/DR-Control."
Open Datasets | No | The paper does not use a pre-existing publicly available or open dataset. Instead, it describes a process for generating random nominal covariance matrices for its experiments: "The nominal covariance matrices of the exogenous uncertainties are constructed randomly and with eigenvalues in the interval [1, 2] (so as to ensure they are positive definite)."
Dataset Splits | No | The paper does not explicitly provide training/test/validation dataset splits. The research is in control theory and optimization, which typically involves system modeling and simulation rather than fixed datasets with predefined splits commonly found in machine learning.
Hardware Specification | Yes | "All experiments are run on an Intel i7-8700 CPU (3.2 GHz) machine with 16GB RAM."
Software Dependencies | No | The paper mentions "Python 3.8.6", "CVXPY [1, 14]", "MOSEK [37]", "Pymanopt [48]", and "PyTorch's automatic differentiation module [39, 40]". While Python has a specific version, multiple other key software components are mentioned without explicit version numbers in the text.
Experiment Setup | Yes | "Consider a class of distributionally robust LQG problems with n = m = p = 10. We set A_t = 0.1·A, where A has ones on the main diagonal and the superdiagonal and zeros everywhere else (A_{i,j} = 1 if i = j or i = j − 1, and A_{i,j} = 0 otherwise), and the other matrices to B_t = C_t = Q_t = R_t = Id. The Wasserstein radii are set to ρ_{x0} = ρ_{wt} = ρ_{vt} = 10^-1. The nominal covariance matrices of the exogenous uncertainties are constructed randomly and with eigenvalues in the interval [1, 2] (so as to ensure they are positive definite). ... we set a stopping criterion corresponding to an optimality gap below 10^-3 and we run the Frank-Wolfe method with δ = 0.95."
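The reported setup (a bidiagonal A_t scaled by 0.1, and random nominal covariances with eigenvalues in [1, 2]) can be reconstructed concretely. The sketch below is one plausible NumPy rendering of that description, not the authors' released code (see the DR-Control repository for that); in particular, the sampling mechanism for the covariances and the seed are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)  # seed chosen here for reproducibility (an assumption)

n = 10  # n = m = p = 10 in the paper's experiments

# A has ones on the main diagonal and the superdiagonal, zeros elsewhere;
# the system matrix is then A_t = 0.1 * A.
A = np.eye(n) + np.eye(n, k=1)
A_t = 0.1 * A

def random_covariance(p, lo=1.0, hi=2.0):
    """Random symmetric matrix with eigenvalues drawn uniformly from [lo, hi],
    hence positive definite. One plausible construction; the paper does not
    spell out the exact sampling mechanism."""
    Q, _ = np.linalg.qr(rng.standard_normal((p, p)))  # random orthogonal basis
    eigs = rng.uniform(lo, hi, size=p)
    return Q @ np.diag(eigs) @ Q.T

Sigma_w = random_covariance(n)  # nominal covariance of one exogenous noise term
```

By construction every eigenvalue of `Sigma_w` lies in [1, 2], so positive definiteness holds without any extra projection step.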
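Algorithm 1 is a Frank-Wolfe (conditional gradient) method specialized to the paper's problem (12); its oracles are specific to that formulation. The generic iteration it instantiates is shown below on a deliberately toy problem (a quadratic minimized over the probability simplex, with the standard 2/(k+2) step size). Everything here — the objective, the feasible set, the step rule, the iteration count — is a toy choice for illustration, not the paper's algorithm.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, n_iters=200):
    """Generic Frank-Wolfe loop: at each step, call a linear minimization
    oracle (LMO) over the feasible set and move toward its answer."""
    x = x0
    for k in range(n_iters):
        g = grad(x)
        s = lmo(g)                  # vertex minimizing the linearized objective
        gamma = 2.0 / (k + 2.0)     # standard open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

# Toy instance: minimize ||x - c||^2 over the simplex {x >= 0, sum(x) = 1}.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: 2 * (x - c)

def lmo(g):
    # The simplex LMO returns the vertex with the smallest gradient entry.
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x_star = frank_wolfe(grad, lmo, np.array([1.0, 0.0, 0.0]))
```

Because every iterate is a convex combination of simplex vertices, feasibility is maintained for free — the property that makes Frank-Wolfe attractive when projections are expensive, as for the covariance-matrix feasible sets in the paper.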