Entropy Induced Pruning Framework for Convolutional Neural Networks

Authors: Yiheng Lu, Ziyu Guan, Yaming Yang, Wei Zhao, Maoguo Gong, Cai Xu

AAAI 2024 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | We implement our AFIE-based pruning method for three popular CNN models of AlexNet, VGG-16, and ResNet-50, and test them on three widely-used image datasets MNIST, CIFAR-10, and ImageNet, respectively. The experimental results are encouraging.
Researcher Affiliation | Academia | Key Laboratory of Collaborative Intelligence Systems, Ministry of Education, Xidian University, Xi'an, China; lyhxdu@gmail.com, zyguan@xidian.edu.cn, yym@xidian.edu.cn, ywzhao@mail.xidian.edu.cn, gong@ieee.org, cxu@xidian.edu.cn
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. The methodology is described using text and equations, and Figure 1 is a diagram, not pseudocode.
Open Source Code | No | The paper does not provide concrete access to source code for the described methodology; there are no links or explicit statements about a code release.
Open Datasets | Yes | We implement our AFIE-based pruning method for three popular CNN models of AlexNet, VGG-16, and ResNet-50, and test them on three widely-used image datasets MNIST, CIFAR-10, and ImageNet, respectively.
Dataset Splits | No | The paper mentions training epochs and test results, but does not provide the specific train/validation/test splits needed to reproduce the experiments, nor does it state whether a validation set was used or how the data were partitioned.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models or memory amounts) used to run its experiments; it describes the CNN models but not the underlying hardware.
Software Dependencies | No | The paper does not provide ancillary software details, such as library names with version numbers (e.g., Python, PyTorch, or TensorFlow versions).
Experiment Setup | Yes | Specifically, we set the training epochs as 1 and 150 for VGG-16 and ResNet-50, as well as 1 and 20 for AlexNet. ... We set the overall pruning ratio λ to 70%, and specify the specific pruning ratio for each layer according to the Equation (8). ... the overall pruning ratio λ is set as 65%, ... When we set the overall pruning ratio λ to 30%, the pruning ratios of Conv1, Conv2, and Conv3 can be set to 17%, Conv4, Conv5, Conv6, and Conv7 can be set to 29%, Conv8, Conv9, Conv10, Conv11, Conv12, and Conv13 can be set to 51%, and Conv14, Conv15, and Conv16 can be specified as 90% according to Equation (10).
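
The layer-wise schedule quoted above is concrete enough to sketch in code. The snippet below is a minimal, hypothetical illustration of applying such a per-layer pruning schedule to VGG-16 in PyTorch; it is not the paper's implementation. The AFIE importance scores and Equations (8)/(10) are not reproduced here (a random placeholder score stands in), torchvision's VGG-16 is used as a stand-in (it exposes 13 Conv2d layers, so the 90% group for Conv14-Conv16 in the quote has no counterpart), and pruned filters are only zeroed rather than physically removed.

```python
# Hypothetical sketch (not the authors' code): applying a per-layer pruning
# schedule like the one quoted above to VGG-16. A random score replaces the
# paper's AFIE metric, which is not reproduced here.
import torch
import torch.nn as nn
from torchvision.models import vgg16

def filter_mask(conv: nn.Conv2d, prune_ratio: float) -> torch.Tensor:
    """Boolean mask keeping the top (1 - prune_ratio) fraction of filters,
    ranked by a placeholder importance score."""
    n_filters = conv.out_channels
    n_keep = max(1, int(round(n_filters * (1.0 - prune_ratio))))
    scores = torch.rand(n_filters)          # placeholder for AFIE scores
    keep = scores.topk(n_keep).indices
    mask = torch.zeros(n_filters, dtype=torch.bool)
    mask[keep] = True
    return mask

model = vgg16(weights=None)
conv_layers = [m for m in model.features if isinstance(m, nn.Conv2d)]

# Quoted schedule for lambda = 30%: 17% for Conv1-3, 29% for Conv4-7,
# 51% for Conv8-13. torchvision's VGG-16 has 13 Conv2d layers, so the
# 90% group (Conv14-16) from the quote is omitted in this sketch.
per_layer_ratios = [0.17] * 3 + [0.29] * 4 + [0.51] * 6

for conv, ratio in zip(conv_layers, per_layer_ratios):
    mask = filter_mask(conv, ratio)
    # A full pipeline would remove the pruned filters (and the matching input
    # channels of the following layer) and then fine-tune; here the pruned
    # filters are only zeroed out to keep the sketch short.
    with torch.no_grad():
        conv.weight[~mask] = 0.0
        if conv.bias is not None:
            conv.bias[~mask] = 0.0
```

Note that the quoted ratios grow with depth, so under this kind of schedule the later convolutional layers lose a much larger fraction of their filters than the early ones.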