Neural Network Pruning by Cooperative Coevolution
Authors: Haopu Shang, Jia-Liang Wu, Wenjing Hong, Chao Qian
IJCAI 2022
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The experiments show that CCEP can achieve a competitive performance with the state-of-the-art pruning methods, e.g., prune ResNet56 for 63.42% FLOPs on CIFAR10 with 0.24% accuracy drop, and ResNet50 for 44.56% FLOPs on ImageNet with 0.07% accuracy drop. |
| Researcher Affiliation | Academia | State Key Laboratory for Novel Software Technology, Nanjing University, Nanjing 210023, China; Department of Computer Science and Engineering, Southern University of Science and Technology, Shenzhen 518055, China |
| Pseudocode | Yes | Algorithm 1 CCEP framework |
| Open Source Code | No | Supplementary materials are available at https://arxiv.org/abs/2204.05639. |
| Open Datasets | Yes | Two classic datasets CIFAR10 [Krizhevsky and Hinton, 2009] and ImageNet [Russakovsky et al., 2015] for image classification are used for the examination. |
| Dataset Splits | Yes | ImageNet contains 1.28M images in the training set and 50K in the validation set, for 1K classes. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used for experiments (e.g., GPU models, CPU types). |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers. |
| Experiment Setup | Yes | The settings of CCEP are described as follows. It runs for 12 iterations, i.e., T = 12 in Algorithm 1. For the EA (i.e., Algorithm 2) in each group, the population size m is 5, the mutation rates p1 = 0.05 and p2 = 0.1, the ratio bound r is 0.1, the maximum generation G is 10, and 20% of the training set is used for accuracy evaluation. |
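The Experiment Setup row lists the EA hyperparameters (population size m = 5, mutation rates p1 = 0.05 and p2 = 0.1, ratio bound r = 0.1, maximum generation G = 10). A minimal sketch of how such a per-group EA over binary pruning masks could look is given below; the loop body, the exact roles of p1/p2 (prune vs. restore flips), and the interpretation of the ratio bound as a cap on the pruned fraction are assumptions for illustration, not the authors' Algorithm 2:

```python
import random

# Hyperparameters quoted from the paper's setup; the EA body is an
# illustrative sketch, not the authors' Algorithm 2.
M = 5            # population size m
P1, P2 = 0.05, 0.10  # mutation rates p1, p2 (their exact roles are assumed here)
G = 10           # maximum generations per group
R = 0.1          # ratio bound r: assumed cap on the pruned fraction per iteration


def mutate(mask, rng):
    """Flip kept filters to pruned with prob P1, pruned back to kept with prob P2."""
    child = list(mask)
    for i, bit in enumerate(child):
        if bit == 1 and rng.random() < P1:
            child[i] = 0
        elif bit == 0 and rng.random() < P2:
            child[i] = 1
    return child


def evolve_group(n_filters, fitness, generations=G, seed=0):
    """Evolve binary pruning masks (1 = keep filter) for one layer group."""
    rng = random.Random(seed)
    pop = [[1] * n_filters]  # start from the unpruned mask
    for _ in range(generations):
        children = [mutate(rng.choice(pop), rng) for _ in range(M)]
        # Enforce the ratio bound: drop children pruning more than R of the filters.
        children = [c for c in children
                    if (n_filters - sum(c)) / n_filters <= R]
        pop = sorted(pop + children, key=fitness, reverse=True)[:M]
    return max(pop, key=fitness)
```

In the paper, fitness would be the accuracy of the pruned network evaluated on 20% of the training set; here any callable scoring a mask can stand in for it.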