Structured Semidefinite Programming for Recovering Structured Preconditioners

Authors: Arun Jambulapati, Jerry Li, Christopher Musco, Kirankumar Shiragur, Aaron Sidford, Kevin Tian

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | We develop a general framework for finding approximately optimal preconditioners for solving linear systems. Leveraging this framework, we obtain improved runtimes for fundamental preconditioning and linear system solving problems, including: (1) Diagonal preconditioning: an algorithm which, given positive definite K ∈ ℝ^{d×d} with nnz(K) nonzero entries, computes an ϵ-optimal diagonal preconditioner in time Õ(nnz(K) · poly(κ*, ϵ⁻¹)), where κ* is the optimal condition number of the rescaled matrix. (2) Structured linear systems: an algorithm which, given M ∈ ℝ^{d×d} that is either the pseudoinverse of a graph Laplacian matrix or a constant spectral approximation of one, solves linear systems in M in Õ(d²) time. Our diagonal preconditioning results improve state-of-the-art runtimes of Ω(d^3.5) attained by general-purpose semidefinite programming, and our solvers improve state-of-the-art runtimes of Ω(d^ω), where ω > 2.3 is the current matrix multiplication constant. We attain our results via new algorithms for a class of semidefinite programs (SDPs) we call matrix-dictionary approximation SDPs, which we leverage to solve an associated problem we call matrix-dictionary recovery.
Researcher Affiliation | Collaboration | Arun Jambulapati (Simons Institute, jmblpati@berkeley.edu); Christopher Musco (New York University, cmusco@nyu.edu); Jerry Li (Microsoft Research, jerrl@microsoft.com); Kirankumar Shiragur (Broad Institute of MIT and Harvard, shiragur@stanford.edu); Aaron Sidford (Stanford University, sidford@stanford.edu); Kevin Tian (University of Texas at Austin, kjtian@cs.utexas.edu)
Pseudocode | Yes | Algorithm 1: Decide Structured MPC({M_i}_{i ∈ [n]}, κ, A_pack, δ, ϵ)
Open Source Code | No | The paper does not provide concrete access information for open-source code (e.g., a repository link, an explicit code-release statement, or mention of code in supplementary materials).
Open Datasets | No | The paper is theoretical, focusing on algorithm design and analysis; it does not use any dataset for training, validation, or testing.
Dataset Splits | No | As a theoretical paper, it does not describe training, validation, or test dataset splits.
Hardware Specification | No | The paper does not report specific hardware (e.g., exact GPU/CPU models or processor types); it is theoretical and conducts no empirical experiments requiring such hardware.
Software Dependencies | No | The paper does not list ancillary software (library or solver names with version numbers) needed to replicate experiments; it is theoretical.
Experiment Setup | No | The paper does not describe an experimental setup with specific hyperparameters, training configurations, or system-level settings.
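
The diagonal preconditioning result summarized above concerns rescaling a positive definite K so that D^{-1/2} K D^{-1/2} has a small condition number. As an illustration of the problem only (this is the classical Jacobi scaling baseline, not the paper's SDP-based ϵ-optimal algorithm, and the test matrix below is a hypothetical construction), a minimal numpy sketch:

```python
import numpy as np

def jacobi_rescale(K):
    """Symmetric diagonal rescaling K -> D^{-1/2} K D^{-1/2}, with D = diag(K)."""
    d = np.sqrt(np.diag(K))
    return K / np.outer(d, d)

def condition_number(A):
    """Ratio of extreme eigenvalues of a symmetric positive definite matrix."""
    eigs = np.linalg.eigvalsh(A)
    return eigs[-1] / eigs[0]

rng = np.random.default_rng(0)
n = 50
# Hypothetical instance: a well-conditioned PSD core A, then badly scaled
# rows/columns via K = S A S, which blows up the condition number.
B = rng.standard_normal((n, n))
A = B @ B.T + n * np.eye(n)
S = np.diag(10.0 ** rng.uniform(-3, 3, size=n))
K = S @ A @ S

# Jacobi rescaling largely undoes the bad diagonal scaling; the paper's
# algorithm instead finds a diagonal within (1 + ϵ) of the optimal one.
print(f"cond(K)        = {condition_number(K):.3e}")
print(f"cond(rescaled) = {condition_number(jacobi_rescale(K)):.3e}")
```

Jacobi scaling is a natural baseline here because the rescaled matrix has unit diagonal; the paper's contribution is computing a provably near-optimal diagonal in time near-linear in nnz(K).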