Riemannian Accelerated Zeroth-order Algorithm: Improved Robustness and Lower Query Complexity
Authors: Chang He, Zhaoye Pan, Xiao Wang, Bo Jiang
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | In this section, we conduct experiments to demonstrate the robustness and efficiency of RAZGD. |
| Researcher Affiliation | Academia | ¹School of Information Management and Engineering, Shanghai University of Finance and Economics; ²Key Laboratory of Interdisciplinary Research of Computation and Economics (Ministry of Education), Shanghai University of Finance and Economics; ³Dishui Lake Advanced Finance Institute, Shanghai University of Finance and Economics. |
| Pseudocode | Yes | Algorithm 1 Riemannian Accelerated Zeroth-order Gradient Descent Algorithm (RAZGD) |
| Open Source Code | No | The paper does not include an unambiguous statement that the authors are releasing the code for the work described, nor does it provide a direct link to a source-code repository. |
| Open Datasets | Yes | We first consider the simplex-constrained least-squares problem (Li et al., 2023c): min_x ‖Ax − b‖²₂ s.t. x ∈ Δ^{d−1}. We test algorithms on both two disease categories and three disease categories, and results are shown in Figure 6. For the unit sphere S^{d−1}, the tangent space is defined as T_x S^{d−1} := {s ∈ ℝ^d : Σ_{j=1}^{d} x_j s_j = 0}. In the experiment, the process of biomarker data generation is consistent with (Das et al., 2022). |
| Dataset Splits | No | The paper does not provide specific dataset split information (exact percentages, sample counts, citations to predefined splits, or detailed splitting methodology) needed to reproduce the data partitioning. |
| Hardware Specification | Yes | All experiments are performed on a computer with a 24-core Intel Core i9-13900HX processor. |
| Software Dependencies | No | The paper does not provide specific ancillary software details, such as library names with version numbers, needed to replicate the experiment. |
| Experiment Setup | Yes | The basic parameters for all three algorithms follow respective theorems. The smoothing parameter µ is set to 0.01 for each algorithm, and notably, we run an additional choice of µ = 0.3 for our algorithm. The initial point is set as the saddle point x0. |
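The quoted setup relies on a smoothing parameter µ for zeroth-order gradient estimation and on the tangent space of the unit sphere T_x S^{d−1} = {s ∈ ℝ^d : Σ_j x_j s_j = 0}. The sketch below illustrates the standard two-point zeroth-order estimator with such a µ, restricted to that tangent space. It is a generic illustration under stated assumptions, not the paper's RAZGD implementation; the function name `zo_grad_sphere` and the test objective are hypothetical.

```python
import numpy as np

def zo_grad_sphere(f, x, mu=0.01, rng=None):
    """Two-point zeroth-order gradient estimate of f at a point x on the
    unit sphere S^{d-1}, with smoothing parameter mu.

    Hypothetical sketch of the generic estimator; not the authors' RAZGD.
    """
    rng = np.random.default_rng(rng)
    d = x.size
    u = rng.standard_normal(d)
    # Project the random direction onto the tangent space
    # T_x S^{d-1} = {s : <x, s> = 0}, then normalize it.
    u -= np.dot(x, u) * x
    u /= np.linalg.norm(u)
    # Retract the perturbed point back onto the sphere before querying f
    # (f is only evaluated, never differentiated: zeroth-order access).
    y = x + mu * u
    x_plus = y / np.linalg.norm(y)
    # Finite-difference estimate along the tangent direction u.
    return d * (f(x_plus) - f(x)) / mu * u

if __name__ == "__main__":
    # Toy smooth objective on S^1 (hypothetical test problem).
    A = np.diag([2.0, 1.0])
    f = lambda v: v @ A @ v
    x = np.array([1.0, 0.0])
    g = zo_grad_sphere(f, x, mu=0.01, rng=0)
    # The estimate lies in the tangent space, so <x, g> is (numerically) zero.
    print(abs(np.dot(x, g)))
```

By construction the estimate stays tangent at x, which is what makes such estimators usable inside Riemannian descent schemes like the one the table describes; the smoothing parameter µ (0.01 or 0.3 in the quoted setup) trades bias against noise in the finite difference.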