Learning Compact Model for Large-Scale Multi-Label Data
Authors: Tong Wei, Yu-Feng Li (pp. 5385-5392)
AAAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments verify that the proposed method clearly reduces the model size compared to state-of-the-art LMLL approaches and, in addition, achieves highly competitive performance. |
| Researcher Affiliation | Academia | Tong Wei, Yu-Feng Li; National Key Laboratory for Novel Software Technology, Nanjing University; Collaborative Innovation Center of Novel Software Technology and Industrialization, Nanjing 210023, China; {weit, liyf}@lamda.nju.edu.cn |
| Pseudocode | Yes | Algorithm 1: Label Optimization; Algorithm 2: POP |
| Open Source Code | No | The paper provides a link to a data repository, but not to the source code of their proposed method. |
| Open Datasets | Yes | All the data sets as well as the experimental results of state-of-the-art LMLL methods are publicly available, and can be downloaded from the Extreme Classification Repository [1]. [1] http://manikvarma.org/downloads/XC/XMLRepository.html |
| Dataset Splits | No | The paper states, 'We report and compare the results using the same train/test splits of data sets,' but does not provide specific percentages, counts, or explicit instructions for creating these splits, nor does it explicitly mention a validation set. |
| Hardware Specification | Yes | All experimental comparisons are conducted on the same PC with an Intel i5-6500 3.20GHz CPU and 32GB RAM. |
| Software Dependencies | No | The paper mentions 'Liblinear (Fan et al. 2008)' for BR and 'default parameter settings in the code' for other methods, but does not provide specific version numbers for any software dependencies. |
| Experiment Setup | Yes | In all of our experiments, we fix the least number of preserved label parameters δ to 5. For LEML, FastXML, SLEEC, and CoH, we use the default parameter settings in the code. (The role of the δ floor is illustrated in the sketch below.) |
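
For context on the reported setup, the quoted parameter δ is described as the least number of preserved label parameters per label. The sketch below shows one plausible reading of magnitude-based pruning with such a per-label floor. It is a minimal illustration under stated assumptions: the function name `prune_label_parameters`, the `threshold` argument, and the dense features-by-labels NumPy layout are all hypothetical, and this does not reproduce the paper's exact POP algorithm.

```python
import numpy as np

def prune_label_parameters(W, threshold, delta=5):
    """Zero out small-magnitude weights per label, but always keep at
    least `delta` parameters for each label (hypothetical sketch)."""
    W_pruned = np.zeros_like(W)
    for j in range(W.shape[1]):           # one column of W per label
        w = W[:, j]
        keep = np.abs(w) >= threshold     # magnitude-based pruning
        if keep.sum() < delta:            # enforce the floor of delta
            top = np.argsort(-np.abs(w))[:delta]
            keep = np.zeros_like(keep)
            keep[top] = True
        W_pruned[keep, j] = w[keep]       # copy only the kept weights
    return W_pruned

# Usage example on a random 100-feature x 10-label weight matrix.
rng = np.random.default_rng(0)
W = rng.normal(size=(100, 10))
W_small = prune_label_parameters(W, threshold=1.5, delta=5)
print("nonzeros per label:", (W_small != 0).sum(axis=0))
```

With δ = 5, as in the reported setup, every label retains at least its five largest-magnitude parameters even when aggressive thresholding would otherwise prune the whole column, which is consistent with the paper's goal of shrinking model size while preserving per-label predictive capacity.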