Polynomial Preconditioning for Gradient Methods
Authors: Nikita Doikov, Anton Rodomanov
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Numerical experiments confirm the efficiency of our preconditioning strategies for solving various machine learning problems. |
| Researcher Affiliation | Academia | Nikita Doikov (EPFL, Switzerland), Anton Rodomanov (UCLouvain, Belgium). Correspondence to: Nikita Doikov <nikita.doikov@epfl.ch>, Anton Rodomanov <anton.rodomanov@uclouvain.be>. |
| Pseudocode | Yes | Algorithm 1 Preconditioned Basic Gradient Method |
| Open Source Code | No | No explicit statement or link providing access to the source code for the methodology described in the paper was found. |
| Open Datasets | Yes | Figure 2: Leading eigenvalues (in the logarithmic scale) of the curvature matrix B, for several typical datasets (LIBSVM: https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/). |
| Dataset Splits | No | The paper does not provide specific details on training, validation, or test dataset splits. |
| Hardware Specification | Yes | Clock time was evaluated on a machine with an Intel Core i5 CPU (1.6 GHz) and 8 GB RAM. All methods were implemented in Python. |
| Software Dependencies | No | The paper states that 'All methods were implemented in Python' but does not provide version numbers for Python or any libraries used. |
| Experiment Setup | No | The paper discusses parameters and adaptive search procedures but does not provide specific numerical values for hyperparameters or other concrete training configurations for the experiments shown. |
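The pseudocode row above refers to the paper's Algorithm 1, a basic gradient method whose step is scaled by a polynomial of the curvature matrix. The sketch below is an illustrative stand-in, not the authors' algorithm: it uses a truncated Neumann-series polynomial as the preconditioner on a quadratic objective, and all function names and parameter choices are hypothetical.

```python
import numpy as np

def poly_precond_apply(A, g, alpha, degree):
    """Apply P(A) @ g, where P is the truncated Neumann series for A^{-1}:
    P(A) = alpha * sum_{i=0}^{degree} (I - alpha * A)^i.
    This is one simple choice of polynomial preconditioner; the paper's
    construction differs."""
    out = np.zeros_like(g)
    term = g.copy()
    for _ in range(degree + 1):
        out += term
        term = term - alpha * (A @ term)  # multiply by (I - alpha * A)
    return alpha * out

def preconditioned_gradient_method(A, b, x0, alpha, degree, iters):
    """Minimize the quadratic f(x) = 0.5 x^T A x - b^T x using steps
    x_{k+1} = x_k - P(A) grad f(x_k)."""
    x = x0.copy()
    for _ in range(iters):
        grad = A @ x - b
        x = x - poly_precond_apply(A, grad, alpha, degree)
    return x
```

With `alpha` below the inverse of the largest eigenvalue of `A`, raising `degree` pushes `P(A)` toward `A^{-1}`, so the step interpolates between plain gradient descent (`degree=0`) and a Newton-like step, which is the general spectrum-flattening effect polynomial preconditioning targets.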