SGD Algorithms based on Incomplete U-statistics: Large-Scale Minimization of Empirical Risk

Authors: Guillaume Papa, Stéphan Clémençon, Aurélien Bellet

NeurIPS 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Beyond the rate bound analysis, experiments on AUC maximization and metric learning provide strong empirical evidence of the superiority of the proposed approach. ... In this section, we provide numerical experiments to compare the incomplete and complete U-statistic gradient estimators (5) and (6) in SGD when they rely on the same number of terms B."
Researcher Affiliation | Academia | "Guillaume Papa, Stéphan Clémençon, LTCI, CNRS, Télécom ParisTech, Université Paris-Saclay, 75013 Paris, France ... Aurélien Bellet, Magnet Team, INRIA Lille Nord Europe, 59650 Villeneuve d'Ascq, France"
Pseudocode | No | The paper does not contain any structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide concrete access to source code for the described methodology.
Open Datasets | Yes | "The datasets we use are available online.1" (footnote 1 points to http://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/)
Dataset Splits | No | The paper states: "In all experiments, we randomly split the data into 80% training set and 20% test set", but it does not explicitly mention a validation set.
Hardware Specification | No | The paper does not provide any specific hardware details (e.g., GPU/CPU models, memory) used to run its experiments.
Software Dependencies | No | The paper does not name ancillary software with version numbers (e.g., libraries or solvers) needed to replicate the experiments.
Experiment Setup | Yes | "We used a step size of the form γt = γ1/t, and the results below are with respect to the number of SGD iterations. ... We try different values for the initial step size γ1 and the batch size B."
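To make the reported setup concrete, the following is a minimal sketch of SGD driven by an incomplete U-statistic gradient estimate for AUC maximization, combining the details the report extracts: an 80%/20% train/test split, B pairs sampled with replacement per iteration (rather than averaging over all positive-negative pairs, as the complete U-statistic would), and a step size γt = γ1/t. The synthetic Gaussian data, the linear scoring model, and the logistic pairwise surrogate are assumptions for illustration; the paper's experiments use LIBSVM datasets and its own surrogate losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data (assumption: stands in for the LIBSVM datasets).
n, d = 1000, 5
X = rng.normal(size=(n, d))
y = (X @ rng.normal(size=d) + 0.5 * rng.normal(size=n) > 0).astype(int)

# Random 80% train / 20% test split, as described in the paper.
perm = rng.permutation(n)
cut = int(0.8 * n)
X_tr, y_tr = X[perm[:cut]], y[perm[:cut]]
X_te, y_te = X[perm[cut:]], y[perm[cut:]]

pos = np.where(y_tr == 1)[0]
neg = np.where(y_tr == 0)[0]

def incomplete_grad(w, B, rng):
    """Incomplete U-statistic gradient estimate: average the pairwise
    gradient over B (positive, negative) pairs drawn with replacement,
    instead of over all |pos| * |neg| pairs (the complete U-statistic).
    Surrogate (an assumption): logistic loss on the score difference."""
    i = rng.choice(pos, size=B)
    j = rng.choice(neg, size=B)
    diff = X_tr[i] - X_tr[j]               # (B, d) pair differences
    margins = diff @ w
    # d/dw log(1 + exp(-m)) = -sigmoid(-m) * diff, with m = w . diff
    coef = -1.0 / (1.0 + np.exp(margins))
    return (coef[:, None] * diff).mean(axis=0)

# SGD with step size gamma_t = gamma_1 / t, per the experiment setup;
# gamma_1 and B are tuning knobs the paper varies.
w = np.zeros(d)
gamma1, B = 1.0, 10
for t in range(1, 2001):
    w -= (gamma1 / t) * incomplete_grad(w, B, rng)

# Test AUC: fraction of (positive, negative) test pairs ranked correctly.
s = X_te @ w
auc = (s[y_te == 1][:, None] > s[y_te == 0][None, :]).mean()
print(f"test AUC: {auc:.3f}")
```

Per iteration this touches only B pairs, so its cost is independent of the O(n²) pair count; that is the efficiency argument the paper's comparison of estimators (5) and (6) turns on.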