Dirichlet-based Per-Sample Weighting by Transition Matrix for Noisy Label Learning

Authors: HeeSun Bae, Seungjae Shin, Byeonghu Na, Il-chul Moon

ICLR 2024

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Empirically, RENT consistently outperforms existing transition matrix utilization methods, which include reweighting, on various benchmark datasets. Our code is available at https://github.com/BaeHeeSun/RENT.
Researcher Affiliation | Collaboration | Hee Sun Bae (1), Seungjae Shin (1), Byeonghu Na (1) & Il-Chul Moon (1, 2); (1) Department of Industrial and Systems Engineering, KAIST; (2) summary.ai
Pseudocode | Yes | Algorithm 1: REsampling utilizing the Noise Transition matrix (RENT). A hedged code sketch of the resampling step follows the table.
Open Source Code | Yes | Our code is available at https://github.com/BaeHeeSun/RENT.
Open Datasets | Yes | We evaluate our method, RENT, on CIFAR10 and CIFAR100 (Krizhevsky & Hinton, 2009) with synthetic label noise and on two real-world noisy datasets, CIFAR10N (Wei et al., 2022) and Clothing1M (Xiao et al., 2015).
Dataset Splits | No | We trained for a total of 200 epochs on all benchmark datasets, and no validation dataset was utilized for early stopping.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running the experiments.
Software Dependencies | No | The paper mentions the Adam optimizer and ResNet but does not specify version numbers for any key software libraries or frameworks (e.g., PyTorch, TensorFlow, scikit-learn).
Experiment Setup | Yes | We used the Adam optimizer (Kingma & Ba, 2014) with learning rate 0.001 for training; no learning rate decay was applied. We trained for a total of 200 epochs on all benchmark datasets... We used a batch size of 128, with Horizontal Flip and Random Crop applied as augmentation. A hedged configuration sketch follows below.
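
To make the Pseudocode row concrete, here is a minimal PyTorch sketch of the resampling step that Algorithm 1 (RENT) refers to. It assumes a transition matrix T with T[i, j] = P(noisy label j | clean label i) and uses an importance-style weight (estimated clean posterior on the observed label divided by the forward-corrected noisy posterior on that label); the function name rent_batch_loss and the exact weight formula are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F

def rent_batch_loss(model, x, noisy_y, T):
    """One mini-batch loss with transition-matrix-based resampling (sketch).

    x: (B, ...) inputs; noisy_y: (B,) observed noisy labels;
    T: (C, C) transition matrix, T[i, j] = P(noisy label j | clean label i).
    """
    logits = model(x)                      # (B, C)
    clean_post = F.softmax(logits, dim=1)  # estimated clean-label posterior
    noisy_post = clean_post @ T            # forward-corrected noisy posterior

    # Importance-style per-sample weight: clean posterior on the observed
    # (noisy) label divided by the noisy posterior on that label.
    idx = torch.arange(x.size(0), device=logits.device)
    w = (clean_post[idx, noisy_y] / noisy_post[idx, noisy_y].clamp_min(1e-12)).detach()

    # Resample the mini-batch with probability proportional to w, then
    # train with plain cross-entropy on the resampled batch.
    sampled = torch.multinomial(w / w.sum(), num_samples=x.size(0), replacement=True)
    return F.cross_entropy(logits[sampled], noisy_y[sampled])
```

In the paper's framing, resampling of this kind is presented as one instance of Dirichlet-based per-sample weighting; drawing the batch weights from a Dirichlet distribution rather than resampling indices recovers the more general scheme.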
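
The Experiment Setup row maps directly onto a short training-loop configuration. The sketch below follows the stated hyperparameters (Adam, learning rate 0.001, no decay, 200 epochs, batch size 128, Random Crop and Horizontal Flip); the ResNet variant, the crop padding, and the placeholder transition matrix are assumptions, since the quoted setup does not pin them down.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

# Augmentations as stated: Random Crop and Horizontal Flip.
# (32x32 crop with padding 4 is an assumption; the paper's quote gives no padding.)
train_tf = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

train_set = datasets.CIFAR10(root="./data", train=True, download=True, transform=train_tf)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18(num_classes=10).to(device)  # exact ResNet variant is an assumption
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # Adam, lr 0.001, no decay

# Placeholder transition matrix: 20% symmetric noise. In the actual method,
# T would be estimated from the noisy data rather than assumed.
C = 10
T = torch.full((C, C), 0.2 / (C - 1), device=device)
T.fill_diagonal_(0.8)

for epoch in range(200):  # 200 epochs; no validation-based early stopping
    for x, y in train_loader:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        loss = rent_batch_loss(model, x, y, T)  # sketch from above
        loss.backward()
        optimizer.step()
```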