Stochastic Chebyshev Gradient Descent for Spectral Optimization

Authors: Insu Han, Haim Avron, Jinwoo Shin

NeurIPS 2018

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "The utility of our methods is demonstrated in numerical experiments. Our experimental results confirm that the proposed algorithms are significantly faster than other competitors under large-scale real-world instances." |
| Researcher Affiliation | Collaboration | (1) School of Electrical Engineering, Korea Advanced Institute of Science and Technology; (2) Department of Applied Mathematics, Tel Aviv University; (3) AItrics |
| Pseudocode | Yes | Algorithm 1 (SGD for solving (4)) and Algorithm 2 (SVRG for solving (4)) |
| Open Source Code | No | The paper does not provide any explicit statement or link regarding the public availability of its source code. |
| Open Datasets | Yes | "We use the MovieLens 1M and 10M datasets [12] ... We benchmark GP regression under natural sound dataset used in [29] and Szeged humid dataset [5]" |
| Dataset Splits | No | The paper mentions the MovieLens 1M and 10M, natural sound, and Szeged humid datasets, but it does not specify the training, validation, and test splits used. |
| Hardware Specification | No | The paper does not report hardware details such as GPU or CPU models, memory specifications, or cloud instance types used to run the experiments. |
| Software Dependencies | No | The paper does not state software dependencies with version numbers, such as the programming language, libraries, or frameworks used for the implementation. |
| Experiment Setup | No | The paper describes the algorithms and objective functions but omits concrete experimental settings such as hyperparameter values (e.g., learning rates, batch sizes, epochs), regularization weights, and optimizer settings. |
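For context on the Pseudocode row above: the paper's core primitive is a stochastic estimate of spectral sums tr(f(A)) obtained by combining a Chebyshev polynomial approximation of f with Hutchinson's trace estimator, and Algorithms 1 and 2 run SGD/SVRG on objectives built from such estimates. The following is a minimal sketch of that estimator, not the authors' released code; the function names, the NumPy implementation, and the requirement that the eigenvalue bounds [a, b] be supplied are assumptions.

```python
import numpy as np

def chebyshev_coeffs(f, degree, a, b):
    """Chebyshev interpolation coefficients of f on [a, b]."""
    k = np.arange(degree + 1)
    # Chebyshev nodes of the first kind in [-1, 1]
    nodes = np.cos(np.pi * (k + 0.5) / (degree + 1))
    fvals = f(0.5 * (b - a) * nodes + 0.5 * (a + b))  # map nodes to [a, b]
    T = np.cos(np.outer(np.arange(degree + 1), np.arccos(nodes)))  # T[j, k] = T_j(node_k)
    c = (2.0 / (degree + 1)) * (T @ fvals)
    c[0] /= 2.0
    return c

def stochastic_chebyshev_trace(A, f, degree, num_probes, a, b, rng=None):
    """Estimate tr(f(A)) for symmetric A with eigenvalues in [a, b],
    using Rademacher (Hutchinson) probes and the Chebyshev three-term
    recurrence w_{j+1} = 2 B w_j - w_{j-1} on the rescaled matrix
    B = (2A - (a + b)I) / (b - a)."""
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    c = chebyshev_coeffs(f, degree, a, b)
    est = 0.0
    for _ in range(num_probes):
        v = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        w0 = v                                # T_0(B) v
        w1 = (2.0 * (A @ v) - (a + b) * v) / (b - a)  # T_1(B) v = B v
        acc = c[0] * (v @ w0) + c[1] * (v @ w1)
        for j in range(2, degree + 1):
            w2 = 2.0 * (2.0 * (A @ w1) - (a + b) * w1) / (b - a) - w0
            acc += c[j] * (v @ w2)
            w0, w1 = w1, w2
        est += acc
    return est / num_probes
```

With f = log this estimates the log-determinant, one of the spectral sums benchmarked in the paper; the polynomial degree and number of probes trade accuracy against the number of matrix-vector products, which is the knob the SGD/SVRG variants exploit.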