Guess-Averse Loss Functions For Cost-Sensitive Multiclass Boosting

Authors: Oscar Beijbom, Mohammad Saberian, David Kriegman, Nuno Vasconcelos

ICML 2014

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Extensive experiments demonstrate (1) the importance of guess-aversion and (2) that the GLL loss function outperforms other loss functions for multiclass boosting.
Researcher Affiliation | Academia | University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093
Pseudocode | Yes | Algorithm 1: (GLL, GEL, Ls, Lt)-MCBoost
Open Source Code | Yes | The MATLAB implementation of the proposed boosting algorithms, along with experimental details, is available in the supplementary material.
Open Datasets | Yes | These experiments used 10 UCI datasets and a large-scale computer vision dataset for coral classification (Beijbom et al., 2012).
Dataset Splits | Yes | For these, training/testing partitions are either predefined or the data is randomly split into 80% training and 20% testing (a sketch of this split appears after the table).
Hardware Specification | No | The paper does not provide specific hardware details, such as GPU models, CPU types, or memory specifications, used for running the experiments.
Software Dependencies | No | The paper mentions a 'MATLAB implementation' and the 'LIBLINEAR implementation (Fan et al., 2008)' but does not provide version numbers for these software components or any other libraries.
Experiment Setup | Yes | For each dataset, a random symmetric cost matrix was generated, with C_{j,k}, j ≠ k, drawn uniformly from [1, 10] ⊂ ℝ, and all boosted classifiers were trained with 100 iterations. The procedure was repeated 50 times per dataset... The boosting methods were trained with 500 iterations. (A sketch of the cost-matrix generation appears after the table.)
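
Where no predefined partition exists, the random 80%/20% split described in the Dataset Splits row could be reproduced along the lines of the sketch below. The paper's released code is MATLAB; this NumPy-based version, including the function name `random_split` and the fixed seed, is an assumption for illustration only.

```python
import numpy as np

def random_split(X, y, train_frac=0.8, seed=0):
    """Randomly split a dataset into training and test partitions.

    Hypothetical helper; the paper only states an 80%/20% random split
    is used when no predefined partition exists.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))          # shuffle example indices
    n_train = int(round(train_frac * len(y)))
    train_idx, test_idx = idx[:n_train], idx[n_train:]
    return X[train_idx], y[train_idx], X[test_idx], y[test_idx]
```

For the UCI datasets that ship with predefined training/testing partitions, this step would simply be skipped.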
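
The random symmetric cost matrix from the Experiment Setup row (off-diagonal costs C_{j,k}, j ≠ k, drawn uniformly from [1, 10]) could be generated as sketched below. The helper name, the NumPy implementation, and the zero diagonal (the usual cost-sensitive convention for correct predictions) are assumptions, not taken from the authors' MATLAB code.

```python
import numpy as np

def random_symmetric_cost_matrix(n_classes, low=1.0, high=10.0, seed=0):
    """Draw a symmetric cost matrix with off-diagonal entries uniform in [low, high]."""
    rng = np.random.default_rng(seed)
    C = rng.uniform(low, high, size=(n_classes, n_classes))
    C = np.triu(C, k=1)        # keep only strictly upper-triangular draws
    C = C + C.T                # mirror them to make the matrix symmetric
    np.fill_diagonal(C, 0.0)   # assumed zero cost for a correct prediction
    return C
```

Per the setup described above, a fresh matrix of this kind would be drawn for each of the 50 repetitions on every dataset.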