Zeroth-Order Optimization for Composite Problems with Functional Constraints
Authors: Zichong Li, Pin-Yu Chen, Sijia Liu, Songtao Lu, Yangyang Xu
AAAI 2022, pp. 7453-7461
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we conduct numerical experiments to demonstrate the performance of our proposed ZO-iALM. We consider the problem of resource allocation in sensor networks and the adversarial example generation problem. |
| Researcher Affiliation | Collaboration | Zichong Li (1), Pin-Yu Chen (2), Sijia Liu (3), Songtao Lu (2), Yangyang Xu (1). (1) Department of Mathematical Sciences, Rensselaer Polytechnic Institute; (2) IBM Research; (3) Department of Computer Science and Engineering, Michigan State University |
| Pseudocode | Yes | Algorithm 1: Zeroth-order inexact augmented Lagrangian method (ZO-iALM) and Algorithm 2: Zeroth-order accelerated proximal coordinate update for (15): ZO-APCU(G, H, µ, L, ε). (A generic ALM outer-loop sketch follows the table.) |
| Open Source Code | No | The paper does not provide a direct link to source code or explicitly state that the code for the described methodology is publicly available. |
| Open Datasets | Yes | In the test, we use the ovarian cancer dataset (Conrads et al. 2004; Petricoin III et al. 2002), which comes from m = 216 patients. Each data point has d = 4,000 features and a label indicating whether the corresponding patient has ovarian cancer. We first use MATLAB's built-in lasso function (with λ = 0.01) to train a LASSO regression model parameterized by θ. (See the LASSO training sketch after the table.) |
| Dataset Splits | No | The paper describes the datasets used and some parameters but does not provide specific details on training, validation, or test dataset splits or splitting methodology. |
| Hardware Specification | Yes | All the tests were performed in MATLAB 2019b on a MacBook Pro with 4 cores and 16GB memory. |
| Software Dependencies | Yes | All the tests were performed in MATLAB 2019b on a MacBook Pro with 4 cores and 16GB memory. |
| Experiment Setup | Yes | We set d = 80, λ = 0.5, and ε = 0.5. ... In each call to the ZO-iPPM subroutine, we set the smoothness parameter to L̂_k = 50 + 0.3β_k. We tune the parameters of ZO-AdaMM to α = 1, β_1 = 0.75, β_2 = 1, and fix the step size to 0.01 in ZO-ProxSGD. For each method, we choose a = 10^-6 as the sampling radius and w_k = 1/‖c(x_k)‖ as the dual step size. ... In (23), we set λ = 0.01 and ε = 0.1. Due to the large variable dimension, we set ε = 1 in stopping conditions. ... In each method, we set a = 10^-6 as the sampling radius and w_k = 1/‖c(x_k)‖ as the dual step size. (See the zeroth-order estimator sketch after the table.) |
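
The sampling radius a = 10^-6 quoted in the Experiment Setup row is the finite-difference radius of a zeroth-order gradient estimator. Below is a minimal sketch of a standard two-point Gaussian-smoothing estimator in Python; the function name `zo_gradient`, the Gaussian directions, and the sample count are illustrative assumptions rather than the authors' exact estimator.

```python
import numpy as np

def zo_gradient(f, x, radius=1e-6, num_samples=20, rng=None):
    """Two-point Gaussian-smoothing estimate of the gradient of f at x.

    Averages (f(x + radius * u) - f(x)) / radius * u over random Gaussian
    directions u; `radius` plays the role of the sampling radius a = 1e-6
    quoted in the Experiment Setup row.
    """
    rng = np.random.default_rng() if rng is None else rng
    grad = np.zeros_like(x, dtype=float)
    fx = f(x)
    for _ in range(num_samples):
        u = rng.standard_normal(x.shape)
        grad += (f(x + radius * u) - fx) / radius * u
    return grad / num_samples

# Illustrative check on a smooth quadratic (not a problem from the paper):
# the true gradient at x = (1, ..., 1) is x itself.
x = np.ones(5)
print(zo_gradient(lambda z: 0.5 * np.dot(z, z), x, num_samples=2000))
```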
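
Algorithm 1 in the paper is an inexact augmented Lagrangian method. The sketch below shows a generic inexact ALM outer loop for equality-type constraints c(x) = 0, using the dual step size w_k = 1/‖c(x_k)‖ mentioned in the Experiment Setup row; the helper `solve_subproblem`, the penalty update rule, and the iteration count are assumptions for illustration, not the paper's ZO-iALM.

```python
import numpy as np

def inexact_alm(f, c, x0, solve_subproblem, beta0=1.0, sigma=3.0, outer_iters=20):
    """Generic inexact augmented Lagrangian outer loop (illustrative sketch).

    f: objective; c: constraint map (c(x) = 0 desired);
    solve_subproblem(phi, x): returns an approximate minimizer of phi starting
    from x, e.g. a zeroth-order inner solver.
    """
    x = np.asarray(x0, dtype=float)
    lam = np.zeros_like(np.atleast_1d(c(x)))
    beta = beta0
    for _ in range(outer_iters):
        def aug_lagrangian(z, lam=lam, beta=beta):
            cz = np.atleast_1d(c(z))
            return f(z) + lam @ cz + 0.5 * beta * cz @ cz
        x = solve_subproblem(aug_lagrangian, x)   # inexact primal update
        ck = np.atleast_1d(c(x))
        w = 1.0 / max(np.linalg.norm(ck), 1e-12)  # dual step size w_k = 1/||c(x_k)||
        lam = lam + w * ck                        # multiplier update
        beta *= sigma                             # increase the penalty parameter
    return x
```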
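
The Open Datasets row states that the surrogate model θ is fit with MATLAB's built-in lasso function at λ = 0.01. A rough scikit-learn analogue is sketched below; the file names are hypothetical placeholders, and mapping MATLAB's Lambda to scikit-learn's alpha is an assumption (both solvers scale the squared loss by 1/(2n), but conventions should be checked).

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical placeholders for the ovarian cancer data (216 patients, 4,000 features).
X = np.load("ovarian_features.npy")   # shape (216, 4000), assumed preprocessed
y = np.load("ovarian_labels.npy")     # shape (216,), cancer / no-cancer labels

model = Lasso(alpha=0.01, max_iter=100000)  # alpha taken as MATLAB's Lambda = 0.01 (assumption)
model.fit(X, y)
theta = model.coef_                         # LASSO regression parameters theta
```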