Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

New Insight into Hybrid Stochastic Gradient Descent: Beyond With-Replacement Sampling and Convexity

Authors: Pan Zhou, Xiaotong Yuan, Jiashi Feng

NeurIPS 2018 | Venue PDF | LLM Run Details

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Extensive numerical results confirm our theoretical affirmation and demonstrate the favorable efficiency of WoRS-based HSGD." |
| Researcher Affiliation | Academia | Learning & Vision Lab, National University of Singapore, Singapore; B-DAT Lab, Nanjing University of Information Science & Technology, Nanjing, China |
| Pseudocode | Yes | Algorithm 1: Hybrid SGD under WoRS |
| Open Source Code | No | The paper does not provide any statement about releasing source code or a link to a code repository. |
| Open Datasets | Yes | All datasets are public datasets from LibSVM, downloadable from https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/ |
| Dataset Splits | No | The paper does not specify exact percentages or sample counts for training, validation, and test splits, nor does it cite predefined splits for reproducibility. |
| Hardware Specification | No | The paper does not specify the hardware used for the experiments, such as CPU or GPU models, or cloud computing environments with specifications. |
| Software Dependencies | No | The paper does not list software dependencies with version numbers (e.g., libraries, frameworks, programming-language versions) used for the experiments. |
| Experiment Setup | No | Although the paper states that "hyper-parameters of all the algorithms are tuned to best" and discusses learning-rate and mini-batch-size strategies in theory, it does not report the specific hyperparameter values or training configurations used in the experiments. |