Approximate Conditional Gradient Descent on Multi-Class Classification

Authors: Zhuanghua Liu, Ivor Tsang

AAAI 2017 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Empirical results verify that our method outperforms the state-of-the-art stochastic projection-free methods."
Researcher Affiliation | Academia | "Zhuanghua Liu, Ivor Tsang, Centre for Artificial Intelligence, University of Technology Sydney. liuzhuanghua1991@gmail.com, Ivor.Tsang@uts.edu.au"
Pseudocode | Yes | "Algorithm 1 Approximate Frank-Wolfe" (an illustrative conditional-gradient sketch appears after the table)
Open Source Code | No | The paper does not provide any explicit statement or link to open-source code for the described methodology.
Open Datasets | Yes | "We conducted our experiment on several large-scale datasets from the libsvm website¹. The datasets are summarized in the following table: [...]" ¹ https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/ (a data-loading sketch appears after the table)
Dataset Splits | No | The paper mentions using "training data" and "datasets" for evaluation but does not specify train/validation/test splits, percentages, or absolute sample counts; there is no mention of a "validation" set.
Hardware Specification | Yes | "Our experiments run on a server with 3.1 GHz CPU and 132 GB memory."
Software Dependencies | No | "Our algorithm and baseline methods are implemented in Matlab." (No Matlab version number is provided.)
Experiment Setup | Yes | "In the evaluation of approximate Frank-Wolfe, we set ν = 50, δ = 10 and k₀ = 100. For fair comparison, we use the default parameters for SFW and SVRF from (Hazan and Luo 2016), i.e., the size of the mini-batch at round t is t² for SFW and t for SVRF, respectively." (a batch-schedule sketch appears after the table)
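The paper's Algorithm 1 is an approximate Frank-Wolfe (conditional gradient) method, whose inner workings are not reproduced in this report. As orientation only, here is a minimal Python sketch of a standard Frank-Wolfe loop over the probability simplex, with a comment marking the linear subproblem that the paper's variant solves only approximately; the quadratic objective and all names are illustrative assumptions, not the authors' code.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=100):
    """Minimal exact-oracle Frank-Wolfe over the probability simplex.

    Illustrative sketch only; not the paper's Algorithm 1."""
    x = x0.copy()
    for t in range(1, n_iters + 1):
        g = grad(x)
        # Linear minimization oracle (LMO): over the simplex, the minimizer
        # of <g, s> is the vertex e_i with i = argmin_i g_i. The paper's
        # "approximate" variant solves this subproblem only approximately.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (t + 2.0)            # standard O(1/t) step size
        x = (1.0 - gamma) * x + gamma * s  # convex combination stays feasible
    return x

# Usage: minimize f(x) = 0.5 * ||x - b||^2 over the simplex (gradient x - b).
b = np.array([0.2, 0.7, 0.1])
x_star = frank_wolfe_simplex(lambda x: x - b, np.ones(3) / 3)
print(x_star)  # approaches b, which already lies on the simplex
```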
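The datasets come from the LIBSVM repository, which distributes data in the svmlight/libsvm sparse text format. A minimal loading sketch using scikit-learn's load_svmlight_file follows; the filename is a placeholder, since the paper's dataset table is elided in the row above.

```python
from sklearn.datasets import load_svmlight_file

# "mnist.scale" is a placeholder filename; the paper's dataset table
# is not reproduced in this report.
X, y = load_svmlight_file("mnist.scale")
print(X.shape, len(set(y)))  # X is a scipy.sparse CSR matrix, y the labels
```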
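For the mini-batch schedules quoted in the Experiment Setup row (size t² for SFW and t for SVRF at round t), the sketch below shows how a growing batch plugs into a stochastic Frank-Wolfe loop. Everything beyond the two schedule formulas is an assumption for illustration; in particular, plain gradient averaging stands in for SFW, and this is not Hazan and Luo's full SVRF variance-reduction scheme.

```python
import numpy as np

def sfw_batch_size(t):
    return t ** 2      # SFW: batch size t^2 at round t (quoted setup)

def svrf_batch_size(t):
    return t           # SVRF: batch size t at round t (quoted setup)

def stochastic_frank_wolfe(stoch_grad, x0, n_rounds=30,
                           batch_size=sfw_batch_size):
    """Stochastic Frank-Wolfe over the simplex with a growing mini-batch."""
    x = x0.copy()
    for t in range(1, n_rounds + 1):
        # Average batch_size(t) stochastic gradients to shrink the variance.
        g = np.mean([stoch_grad(x) for _ in range(batch_size(t))], axis=0)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0              # exact LMO over the simplex
        gamma = 2.0 / (t + 2.0)
        x = (1.0 - gamma) * x + gamma * s
    return x

# Usage with noisy gradients of f(x) = 0.5 * ||x - b||^2.
b = np.array([0.2, 0.7, 0.1])
x_hat = stochastic_frank_wolfe(lambda x: x - b + 0.1 * np.random.randn(3),
                               np.ones(3) / 3)
```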