Multi-Label Classification with Feature-Aware Non-Linear Label Space Transformation
Authors: Xin Li, Yuhong Guo
IJCAI 2015
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct experiments on a number of multi-label classification datasets. The proposed approach demonstrates good performance, comparing to a number of state-of-the-art label dimension reduction methods. |
| Researcher Affiliation | Academia | Xin Li and Yuhong Guo Department of Computer and Information Sciences Temple University, Philadelphia, PA 19122, USA {xinli, yuhong}@temple.edu |
| Pseudocode | Yes | Algorithm 1 A Unified Training Algorithm |
| Open Source Code | No | No statement providing access to open-source code was found. |
| Open Datasets | Yes | We used five real world multi-label datasets for image and text categorization tasks in our experiments, including Corel5K, ESPGame, Iaprtc12, Enron, and Delicious. |
| Dataset Splits | Yes | We conducted experiments using 10-fold cross validation on four datasets, except the large scale dataset Delicious, on which we conducted experiments using 5-fold cross validation. In each cross validation iteration, we performed parameter selection for all the comparison methods by using 80% of the training set for training and the remaining 20% for performance evaluation. |
| Hardware Specification | Yes | To compare the empirical computational complexity of the comparison methods, we reported in Table 2 the training time and testing time of each method for a single run with θ=0.3 on a 64-bit PC with 4 processors (3.4 GHz) and 16 GB memory. |
| Software Dependencies | No | No specific software dependencies with version numbers were mentioned. |
| Experiment Setup | Yes | For the proposed COMB method, there are two parameters µ and γ to be tuned for the decoding process. We selected the µ value from the set [0.001, 0.005, 0.01, 0.05, 0.1], and selected the γ value from the set [0, 0.2, 0.4, 0.6, 0.8, 1]. |
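The parameter-selection protocol quoted above (an inner 80%/20% split of the training set, scored over the µ and γ grids) can be sketched as follows. This is a hypothetical illustration, not the authors' code: the grids are taken from the paper, but `select_parameters`, `score_fn`, and the toy scoring function are placeholders standing in for training and evaluating the COMB model.

```python
# Hedged sketch of the grid search described in the report: within each
# cross-validation iteration, 80% of the training set is used for fitting
# and the remaining 20% for scoring each (mu, gamma) candidate.
import itertools
import random

MU_GRID = [0.001, 0.005, 0.01, 0.05, 0.1]   # mu values from the paper
GAMMA_GRID = [0, 0.2, 0.4, 0.6, 0.8, 1]     # gamma values from the paper

def select_parameters(train_set, score_fn, seed=0):
    """Pick the (mu, gamma) pair maximizing validation score on an 80/20 split.

    `score_fn(fit_part, val_part, mu, gamma)` is assumed to train a model on
    `fit_part` and return its evaluation score on `val_part`.
    """
    rng = random.Random(seed)
    data = list(train_set)
    rng.shuffle(data)
    cut = int(0.8 * len(data))
    fit_part, val_part = data[:cut], data[cut:]
    return max(itertools.product(MU_GRID, GAMMA_GRID),
               key=lambda p: score_fn(fit_part, val_part, *p))

# Toy usage: a placeholder score with a fake optimum at mu=0.01, gamma=0.4
# (any real score function would train and evaluate the actual model).
def toy_score(fit_part, val_part, mu, gamma):
    return -((mu - 0.01) ** 2 + (gamma - 0.4) ** 2)

best_mu, best_gamma = select_parameters(range(100), toy_score)
```

Under the 10-fold (or 5-fold, for Delicious) protocol described above, this selection step would run once per cross-validation iteration.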