Aligned Objective for Soft-Pseudo-Label Generation in Supervised Learning
Authors: Ning Xu, Yihao Hu, Congyu Qiao, Xin Geng
ICML 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments on the benchmark datasets validate the effectiveness of the proposed framework. (Section 4, Experiments) |
| Researcher Affiliation | Academia | 1 School of Computer Science and Engineering, Southeast University, Nanjing, China; 2 Key Laboratory of New Generation Artificial Intelligence Technology and Its Interdisciplinary Applications (Southeast University), Ministry of Education, China. E-mail: {xning, yhhu, qiaocy, xgeng}@seu.edu.cn. |
| Pseudocode | Yes | Algorithm 1 SEAL Algorithm |
| Open Source Code | Yes | Source code is available at https://github.com/palm-ml/SEAL |
| Open Datasets | Yes | We employ three benchmark datasets for multi-class classification, including CIFAR-10, CIFAR-100 (Krizhevsky et al., 2009), and Tiny ImageNet (Le & Yang, 2015), to evaluate the proposed approach. |
| Dataset Splits | Yes | We allocated 10% of the training data from each dataset for validation purposes. |
| Hardware Specification | No | The paper does not provide specific hardware details such as GPU or CPU models used for running the experiments. |
| Software Dependencies | No | The paper mentions software components such as the SGD optimizer and ResNet backbone networks, but does not provide version numbers for any software dependencies or libraries. |
| Experiment Setup | Yes | We configure the total number of epochs as 200 and set the batch size to 128. We employ the SGD optimizer with a momentum of 0.9 and a weight decay of 1e-4, where the initial learning rate is established at 0.1 with a decay factor of 10%. |
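The Dataset Splits row reports that 10% of each training set is held out for validation. A minimal sketch of such a split for CIFAR-10 is shown below; it is not the authors' code, and the random seed and transform are assumptions made for illustration.

```python
# Minimal sketch (not the released SEAL code): hold out 10% of the CIFAR-10
# training set for validation, as described in the Dataset Splits row.
import torch
from torch.utils.data import random_split, DataLoader
from torchvision import datasets, transforms

transform = transforms.ToTensor()              # augmentation choices are assumptions
train_full = datasets.CIFAR10(root="./data", train=True, download=True,
                              transform=transform)

val_size = int(0.1 * len(train_full))          # 10% of 50,000 -> 5,000 images
train_size = len(train_full) - val_size
generator = torch.Generator().manual_seed(0)   # assumed seed, not stated in the paper
train_set, val_set = random_split(train_full, [train_size, val_size],
                                  generator=generator)

train_loader = DataLoader(train_set, batch_size=128, shuffle=True)
val_loader = DataLoader(val_set, batch_size=128, shuffle=False)
```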
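The Experiment Setup row reports 200 epochs, batch size 128, and SGD with momentum 0.9, weight decay 1e-4, and an initial learning rate of 0.1 decayed by a factor of 10%. The sketch below wires those numbers into a standard training configuration; the backbone, loss, and milestone epochs are assumptions (SEAL uses its own objective), and the schedule interpretation is hedged in the comments.

```python
# Minimal sketch (not the authors' implementation): an SGD configuration
# matching the hyperparameters reported in the Experiment Setup row.
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(num_classes=10)               # backbone depth assumed for CIFAR-10
criterion = nn.CrossEntropyLoss()              # placeholder loss; SEAL defines its own objective

optimizer = torch.optim.SGD(model.parameters(), lr=0.1,
                            momentum=0.9, weight_decay=1e-4)
# "Decay factor of 10%" is read here as multiplying the learning rate by 0.1
# at fixed milestones; the exact schedule is an assumption, not stated above.
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer,
                                                 milestones=[100, 150], gamma=0.1)

for epoch in range(200):                       # 200 epochs, as reported
    for images, labels in train_loader:        # train_loader from the split sketch above
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    scheduler.step()
```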