Learning with Instance-Dependent Label Noise: A Sample Sieve Approach
Authors: Hao Cheng, Zhaowei Zhu, Xingyu Li, Yifei Gong, Xing Sun, Yang Liu
ICLR 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We demonstrate the performance of CORES² on CIFAR10 and CIFAR100 datasets with synthetic instance-dependent label noise and Clothing1M with real-world human noise. ... CORES² achieves competitive performance on multiple datasets, including CIFAR-10, CIFAR-100, and Clothing1M, under different label noise settings. |
| Researcher Affiliation | Collaboration | University of California, Santa Cruz, Tencent YouTu Lab {zwzhu,xli279,yangliu}@ucsc.edu, {louischeng,yifeigong,winfredsun}@tencent.com |
| Pseudocode | Yes | Algorithm 1: Instance-Dependent Label Noise Generation (a hedged sketch of such a generator follows this table) |
| Open Source Code | Yes | Code is available at https://github.com/UCSC-REAL/cores. |
| Open Datasets | Yes | Datasets: CORES² is evaluated on three benchmark datasets: CIFAR-10, CIFAR-100 (Krizhevsky et al., 2009) and Clothing1M (Xiao et al., 2015). |
| Dataset Splits | Yes | Standard data augmentation is applied to each dataset. ... We use ResNet34 for CIFAR-10 and CIFAR-100 and ResNet50 for Clothing1M. |
| Hardware Specification | No | Only mentions using ResNet34 for CIFAR-10 and CIFAR-100 and ResNet50 for Clothing1M, which are model architectures, not hardware specifications for running experiments. No details on GPUs, CPUs, or memory were provided. |
| Software Dependencies | No | The paper mentions optimizer (SGD) and loss functions (CE, KL-divergence) but does not provide specific software library names with version numbers (e.g., PyTorch 1.9, TensorFlow 2.x). |
| Experiment Setup | Yes | The basic hyper-parameter settings for CIFAR-10 and CIFAR-100 are listed as follows: mini-batch size (64), optimizer (SGD), initial learning rate (0.1), momentum (0.9), weight decay (0.0005), number of epochs (100) and learning rate decay (0.1 at 50 epochs). An illustrative training-setup sketch follows this table. |
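
The paper's Algorithm 1 generates the synthetic instance-dependent label noise used on CIFAR-10/100. The exact procedure is given in the paper; a minimal sketch of the general idea, assuming per-sample flip rates drawn from a truncated normal around the target noise rate and flip targets weighted by a random projection of the instance's features, might look as follows. Function and variable names here are illustrative and not taken from the released code.

```python
import numpy as np

def gen_instance_dependent_noise(features, labels, num_classes, tau, seed=0):
    """Illustrative sketch: flip each label with an instance-dependent
    probability centered at the overall noise rate tau.

    features: (N, D) array of flattened inputs
    labels:   (N,) array of clean integer labels
    tau:      target average noise rate, e.g. 0.4
    """
    rng = np.random.default_rng(seed)
    n, d = features.shape

    # Per-sample flip rates ~ N(tau, 0.1), truncated to [0, 1].
    flip_rates = np.clip(rng.normal(tau, 0.1, size=n), 0.0, 1.0)

    # A random projection whose response on x decides how likely
    # each wrong class is for that particular instance.
    w = rng.normal(size=(d, num_classes))

    noisy = labels.copy()
    for i in range(n):
        logits = features[i] @ w                 # (num_classes,)
        logits[labels[i]] = -np.inf              # never "flip" to the true class
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        probs *= flip_rates[i]                   # total flip mass = flip rate
        probs[labels[i]] = 1.0 - flip_rates[i]   # keep the clean label otherwise
        noisy[i] = rng.choice(num_classes, p=probs)
    return noisy
```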
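
The reported hyper-parameters (mini-batch 64, SGD with initial learning rate 0.1, momentum 0.9, weight decay 0.0005, 100 epochs, learning rate decayed by 0.1 at epoch 50) together with "standard data augmentation" and ResNet34 on CIFAR map onto a conventional PyTorch training setup. The sketch below shows one way to wire them together; it assumes torchvision's CIFAR-10 loader and ImageNet-style ResNet-34 as stand-ins for the paper's exact pipeline, interprets "standard augmentation" as random crop plus horizontal flip, and omits the CORES² sample sieve and regularized loss.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms, models

# Assumed "standard" CIFAR augmentation: random crop + horizontal flip.
train_tf = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465),
                         (0.2470, 0.2435, 0.2616)),
])
train_set = datasets.CIFAR10(root="./data", train=True, download=True,
                             transform=train_tf)
train_loader = DataLoader(train_set, batch_size=64, shuffle=True,
                          num_workers=4)

# torchvision's ResNet-34 is an ImageNet model; the paper's CIFAR
# variant may differ, so this is a stand-in only.
model = models.resnet34(num_classes=10).cuda()

# Hyper-parameters as listed in the table above.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=5e-4)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer,
                                                 milestones=[50], gamma=0.1)
criterion = nn.CrossEntropyLoss()  # plain CE; the CORES² loss is not reproduced here

for epoch in range(100):
    model.train()
    for x, y in train_loader:
        x, y = x.cuda(), y.cuda()
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    scheduler.step()
```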