Toward Robustness against Label Noise in Training Deep Discriminative Neural Networks

Authors: Arash Vahdat

NeurIPS 2017

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this section, we examine the proposed robust CNN-CRF model for the image labeling problem." "The experimental results are reported in Table 1 under Caption Labels." "Overall, our method achieves slightly better prediction accuracy on the CIFAR-10 dataset than the baselines. And, in terms of recovering clean labels on the noisy training set, our model significantly outperforms the baselines. Examples of the recovered clean labels are visualized for the CIFAR-10 experiment in the supplementary material."
Researcher Affiliation | Industry | Arash Vahdat, D-Wave Systems Inc., Burnaby, BC, Canada (avahdat@dwavesys.com)
Pseudocode | Yes | "Algorithm 1: Train robust CNN-CRF with simple gradient descent" (see the training-loop sketch after the table)
Open Source Code | No | The paper does not explicitly state that its source code is publicly available or provide a link to a code repository.
Open Datasets | Yes | "The Microsoft COCO 2014 dataset is one of the largest publicly available datasets that contains both noisy and clean object labels." "We apply our proposed learning framework to the object classification problem in the CIFAR-10 dataset." (see the label-noise sketch after the table)
Dataset Splits | Yes | "We follow the same 87K/20K/20K train/validation/test split as [4], and use mean average precision (mAP) measure over these 73 object categories as the performance assessment." (see the mAP sketch after the table)
Hardware Specification | No | The paper does not provide specific hardware details such as exact GPU/CPU models or detailed computer specifications used for running its experiments.
Software Dependencies | No | The paper mentions 'TensorFlow' but does not provide specific version numbers for this or any other software dependencies.
Experiment Setup | Yes | "The learning rate and epsilon for the optimizer are set to (0.001, 1) and (0.0003, 0.1) respectively in VGG-16 and ResNet-50. We anneal α from 40 to 5 in 11 epochs." (see the optimizer and annealing sketch after the table)
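
Training-loop sketch. The Pseudocode row cites Algorithm 1, which trains the robust CNN-CRF with simple gradient descent. The code below is a minimal, hypothetical sketch of such a loop, written in PyTorch for brevity even though the paper mentions TensorFlow. The names (infer_clean_posterior, cnn, loader) and the exact inference rule that blends the noisy label with the CNN's belief are assumptions, not the paper's implementation.

```python
# Hypothetical sketch of a robust training loop with latent clean labels.
import torch
import torch.nn.functional as F

def infer_clean_posterior(logits, noisy_labels, alpha, num_classes):
    """Assumed CRF-style inference: blend the CNN's belief with the noisy label.

    alpha controls how strongly the noisy label is trusted; annealing alpha
    (see the Experiment Setup row) gradually shifts trust toward the network.
    """
    onehot = F.one_hot(noisy_labels, num_classes).float()
    # Unnormalized log-posterior over clean labels (a simplification).
    log_post = F.log_softmax(logits, dim=1) + alpha * onehot
    return torch.softmax(log_post, dim=1)

def train_epoch(cnn, loader, optimizer, alpha, num_classes=10):
    for images, noisy_labels in loader:
        logits = cnn(images)
        with torch.no_grad():
            q = infer_clean_posterior(logits, noisy_labels, alpha, num_classes)
        # Gradient step on the expected cross-entropy under the inferred posterior.
        loss = -(q * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```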
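Label-noise sketch. The Open Datasets row notes that CIFAR-10 is used for the noisy-label experiments. Below is a small sketch of injecting uniform synthetic label noise into a label array; the noise rate and the function name are illustrative and the paper's exact noise model may differ.

```python
import numpy as np

def flip_labels(labels, noise_rate, num_classes, seed=0):
    """Replace a fraction of labels with a uniformly random *different* class."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels).copy()
    flip_idx = rng.choice(len(labels), size=int(noise_rate * len(labels)), replace=False)
    for i in flip_idx:
        choices = [c for c in range(num_classes) if c != labels[i]]
        labels[i] = rng.choice(choices)
    return labels

# Example: corrupt 40% of 50,000 CIFAR-10 training labels (illustrative rate).
noisy = flip_labels(np.random.randint(0, 10, size=50000), noise_rate=0.4, num_classes=10)
```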
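mAP sketch. The Dataset Splits row reports mean average precision over 73 COCO object categories. The snippet below shows one common way to compute multi-label mAP per category with scikit-learn; the array names are placeholders, not the paper's code.

```python
import numpy as np
from sklearn.metrics import average_precision_score

def mean_average_precision(y_true, y_score):
    """y_true: (N, C) binary labels, y_score: (N, C) predicted scores.

    Computes AP independently per category and averages; categories with no
    positive example are skipped to keep AP well defined.
    """
    aps = []
    for c in range(y_true.shape[1]):
        if y_true[:, c].sum() == 0:
            continue
        aps.append(average_precision_score(y_true[:, c], y_score[:, c]))
    return float(np.mean(aps))
```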
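Optimizer and annealing sketch. The Experiment Setup row gives a (learning rate, epsilon) pair per backbone and an α annealed from 40 to 5 over 11 epochs. The sketch below wires those numbers up, assuming an Adam-style optimizer (the quoted text names only "the optimizer") and a linear annealing schedule (the schedule's exact shape is not specified).

```python
import torch

def make_optimizer(model, backbone):
    # Reported settings: (lr, eps) = (0.001, 1) for VGG-16 and (0.0003, 0.1) for ResNet-50.
    # Adam is an assumption; the paper does not name the optimizer in the quoted text.
    lr, eps = {"vgg16": (0.001, 1.0), "resnet50": (0.0003, 0.1)}[backbone]
    return torch.optim.Adam(model.parameters(), lr=lr, eps=eps)

def alpha_schedule(epoch, start=40.0, end=5.0, anneal_epochs=11):
    """Anneal alpha from 40 to 5 over the first 11 epochs (linear shape assumed)."""
    if epoch >= anneal_epochs - 1:
        return end
    t = epoch / (anneal_epochs - 1)
    return start + t * (end - start)
```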