Multi-level Generative Models for Partial Label Learning with Non-random Label Noise

Authors: Yan Yan, Yuhong Guo

IJCAI 2021

| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "We conduct extensive experiments on both synthesized and real-world partial label datasets. The proposed approach demonstrates the state-of-the-art performance for partial label learning." |
| Researcher Affiliation | Academia | Yan Yan¹, Yuhong Guo²,³; ¹Northwestern Polytechnical University, China; ²Carleton University, Canada; ³Canada CIFAR AI Chair, Amii |
| Pseudocode | No | The paper describes an algorithm verbally in Section 3.4, but does not provide any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements about releasing source code or links to a code repository. |
| Open Datasets | Yes | "The synthetic datasets are generated from six UCI datasets, ecoli, deter, vehicle, segment, satimage and letter. ... We used five real-world PL datasets that are collected from several application domains, including FG-NET [Panis and Lanitis, 2014] for facial age estimation, Lost [Cour et al., 2011], Yahoo! News [Guillaumin et al., 2010] for automatic face naming in images or videos, MSRCv2 [Dietterich and Bakiri, 1994] for object classification, and Bird Song [Briggs et al., 2012] for bird song classification." (See the candidate-label synthesis sketch below the table.) |
| Dataset Splits | Yes | "For each PL dataset, ten-fold cross-validation is performed and the average test accuracy results are recorded. ... For each dataset, ten-fold cross-validation is conducted." (See the cross-validation sketch below the table.) |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for running the experiments (e.g., GPU models, CPU types). |
| Software Dependencies | No | The paper does not list specific software dependencies with version numbers. |
| Experiment Setup | No | The paper describes the general architecture and optimization method but does not provide specific hyperparameter values or detailed training configurations (e.g., learning rate, batch size, number of epochs). |
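
To make the Open Datasets row concrete: partial label (PL) datasets are synthesized from ordinary multi-class UCI data by attaching distractor labels to each instance's ground-truth label. The paper studies non-random (instance-dependent) label noise, and the quoted text does not specify its exact generation protocol; the sketch below instead uses the common uniform-distractor protocol purely for illustration, with a hypothetical parameter `r` (distractors per instance) that is not taken from the paper.

```python
import numpy as np

def synthesize_partial_labels(y, n_classes, r=1, seed=0):
    """Turn ground-truth labels y into candidate label sets by adding
    r distractor labels per instance, drawn uniformly at random.

    NOTE: this is a generic uniform protocol, not the paper's
    non-random noise model. Returns a binary matrix S of shape
    (n_samples, n_classes) with S[i, k] = 1 iff class k is a
    candidate label for instance i."""
    rng = np.random.default_rng(seed)
    n = len(y)
    S = np.zeros((n, n_classes), dtype=np.int8)
    S[np.arange(n), y] = 1  # the true label is always a candidate
    for i in range(n):
        # sample r distractors from the classes other than the true one
        others = [k for k in range(n_classes) if k != y[i]]
        distractors = rng.choice(others, size=r, replace=False)
        S[i, distractors] = 1
    return S
```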
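
The Dataset Splits row quotes a standard protocol: ten-fold cross-validation with the average test accuracy recorded. A minimal sketch of that evaluation loop follows, assuming a generic `train_fn`/`predict_fn` interface as a stand-in for the paper's model (both names are hypothetical, not from the paper); candidate-label sets are used for training, while held-out ground-truth labels are used only for scoring.

```python
import numpy as np
from sklearn.model_selection import KFold

def ten_fold_accuracy(X, S, y_true, train_fn, predict_fn, seed=0):
    """Run ten-fold cross-validation and return the average test accuracy.

    X: feature matrix; S: candidate-label matrix used for training;
    y_true: ground-truth labels used only to score test predictions.
    train_fn(X_tr, S_tr) -> model and predict_fn(model, X_te) -> labels
    are placeholders for the PLL method under evaluation."""
    kf = KFold(n_splits=10, shuffle=True, random_state=seed)
    accs = []
    for train_idx, test_idx in kf.split(X):
        model = train_fn(X[train_idx], S[train_idx])
        pred = predict_fn(model, X[test_idx])
        accs.append(np.mean(pred == y_true[test_idx]))
    return float(np.mean(accs))  # average accuracy over the ten folds
```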