Partial Multi-Label Learning with Label Distribution
Authors: Ning Xu, Yun-Peng Liu, Xin Geng (pp. 6510-6517)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results on synthetic as well as real-world datasets clearly validate the effectiveness of PML-LD for solving PML problems. |
| Researcher Affiliation | Academia | Ning Xu, Yun-Peng Liu, Xin Geng. MOE Key Laboratory of Computer Network and Information Integration, China; School of Computer Science and Engineering, Southeast University, Nanjing 210096, China. {xning, yunpengliu, xgeng}@seu.edu.cn |
| Pseudocode | No | The paper describes methods in textual paragraphs but does not include any explicitly labeled 'Pseudocode' or 'Algorithm' blocks. |
| Open Source Code | No | The paper does not provide any explicit statements about the release of its source code or links to a code repository. |
| Open Datasets | Yes | five benchmark multi-label datasets (Zhang and Zhou 2014) are used to generate synthetic PML datasets, including image, emotions, scene, yeast, and eurlex-sm. Furthermore, three real-world PML datasets including music emotion, music style and mirflickr (Huiskes and Lew 2008) are also employed in this paper. |
| Dataset Splits | Yes | On each dataset, five-fold cross-validation is performed, and the mean metric value as well as the standard deviation are recorded for each comparing approach. (A minimal cross-validation sketch follows this table.) |
| Hardware Specification | No | The paper does not provide any specific details about the hardware used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers (e.g., Python, PyTorch, TensorFlow versions, or specific library versions). |
| Experiment Setup | Yes | For PML-LD, the parameters λ1, λ2, m, β1 and β2 are fixed to 0.01, 0.01, 20, 1 and 10, respectively. The kernel function in PML-LD is the Gaussian kernel. (A configuration sketch follows this table.) |
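
The "Dataset Splits" row reports five-fold cross-validation with the mean and standard deviation of each metric recorded. The following is a minimal sketch of that evaluation protocol only; the synthetic multi-label data, the one-vs-rest SVM stand-in, and the Hamming-loss metric are illustrative assumptions and do not represent the PML-LD method itself.

```python
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.metrics import hamming_loss
from sklearn.model_selection import KFold
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# Toy multi-label data as a stand-in for the benchmark/real-world PML datasets.
X, Y = make_multilabel_classification(n_samples=500, n_features=20,
                                      n_classes=6, random_state=0)

scores = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    clf = OneVsRestClassifier(SVC(kernel="rbf"))  # stand-in model, not PML-LD
    clf.fit(X[train_idx], Y[train_idx])
    scores.append(hamming_loss(Y[test_idx], clf.predict(X[test_idx])))

# Mean metric value and standard deviation over the five folds, as described above.
print(f"Hamming loss: {np.mean(scores):.3f} +/- {np.std(scores):.3f}")
```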
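The "Experiment Setup" row fixes λ1, λ2, m, β1 and β2 and uses a Gaussian kernel. The sketch below shows one hedged way to record that configuration and compute a Gaussian (RBF) kernel matrix; the dictionary key names, the guessed role of m, and the kernel width gamma are assumptions not stated in the quoted evidence.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

# Fixed PML-LD hyperparameters as reported in the "Experiment Setup" row.
# Key names are illustrative assumptions, not identifiers from the paper.
PML_LD_PARAMS = {
    "lambda1": 0.01,  # λ1
    "lambda2": 0.01,  # λ2
    "m": 20,          # assumed to control a neighborhood/basis size; not specified here
    "beta1": 1,       # β1
    "beta2": 10,      # β2
}

# Gaussian (RBF) kernel matrix over a toy feature matrix; the kernel width
# `gamma` is an assumption, since the exact setting is not quoted above.
X = np.random.RandomState(0).randn(100, 20)
K = rbf_kernel(X, gamma=1.0 / X.shape[1])
print(PML_LD_PARAMS, K.shape)  # kernel matrix of shape (100, 100)
```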