Large-Scale Quadratically Constrained Quadratic Program via Low-Discrepancy Sequences

Authors: Kinjal Basu, Ankan Saha, Shaunak Chatterjee

Venue: NeurIPS 2017

Reproducibility checklist (variable: result, followed by the supporting LLM response):

Research Type: Experimental. 'Experimental results are also shown to prove scalability as well as improved quality of approximation in practice.'

Researcher Affiliation: Industry. Kinjal Basu, Ankan Saha, Shaunak Chatterjee, LinkedIn Corporation, Mountain View, CA 94043, {kbasu, asaha, shchatte}@linkedin.com.

Pseudocode: Yes. Algorithm 1: Point Simulation on S.

Open Source Code: No. The paper does not provide any statement or link indicating that the source code for the described methodology is publicly available.

Open Datasets: No. The paper states 'We randomly sample A, B, x0 and b keeping the problem convex.', indicating synthetic or generated data, without providing access to a specific public dataset or its generation code (a hedged generation sketch appears after this checklist).

Dataset Splits: No. The paper does not provide specific details on training, validation, or test dataset splits.

Hardware Specification: No. The paper does not specify any hardware details (e.g., GPU/CPU models, memory) used to run the experiments.

Software Dependencies: No. The paper mentions 'Operator Splitting or ADMM [10, 26]' and CVX in MATLAB via SeDuMi and SDPT3, but does not provide specific version numbers for any of these software components.

Experiment Setup: Yes. 'The stopping criteria throughout our simulation is the same as that of the Operator Splitting algorithm as presented in [26]. Throughout our simulations, we have chosen η = 2 and the number of optimal points as N = max(1024, 2^m), where m is the smallest integer such that 2^m ≥ 10n.' (A small sketch computing N follows below.)
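
The synthetic instances are only described as randomly sampled A, B, x0 and b with convexity preserved; no generation code is given. The snippet below is a minimal sketch of one way such a convex instance could be drawn, assuming a generic QCQP with positive semidefinite quadratic forms; the exact problem form, dimensions, and sampling distributions used by the authors are not specified, so the names and choices here are illustrative only.

```python
import numpy as np

def random_psd(n, rng):
    """Draw a random positive semidefinite n x n matrix (R^T R is always PSD)."""
    R = rng.standard_normal((n, n))
    return R.T @ R

def random_convex_qcqp(n, num_constraints, seed=0):
    """Illustrative sampler for a convex QCQP instance of the assumed form
        minimize    x^T A x + b^T x
        subject to  (x - x0_i)^T B_i (x - x0_i) <= c_i,  i = 1..num_constraints.
    Convexity is kept by making A and every B_i positive semidefinite.
    The paper only states that A, B, x0 and b are sampled randomly with the
    problem kept convex; this form and these distributions are assumptions.
    """
    rng = np.random.default_rng(seed)
    A = random_psd(n, rng)
    b = rng.standard_normal(n)
    constraints = []
    for _ in range(num_constraints):
        B = random_psd(n, rng)
        x0 = rng.standard_normal(n)
        c = 1.0 + rng.random()  # strictly positive radius keeps x0 feasible
        constraints.append((B, x0, c))
    return A, b, constraints
```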
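The experiment-setup row fixes η = 2 and N = max(1024, 2^m), with m the smallest integer such that 2^m ≥ 10n. The helper below is a small sketch of that stated rule, assuming n denotes the problem dimension:

```python
import math

def num_points(n, n_min=1024):
    """N = max(n_min, 2**m), where m is the smallest integer with
    2**m >= 10*n, following the rule quoted in the experiment-setup row."""
    m = math.ceil(math.log2(10 * n))
    return max(n_min, 2 ** m)

# Example: for n = 500, 10*n = 5000, so m = 13 and N = max(1024, 8192) = 8192.
print(num_points(500))  # 8192
```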