SCOP: Scientific Control for Reliable Neural Network Pruning

Authors: Yehui Tang, Yunhe Wang, Yixing Xu, Dacheng Tao, Chunjing Xu, Chao Xu, Chang Xu

NeurIPS 2020 | Conference PDF | Archive PDF | Plain Text | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Through experiments, we demonstrate the superiority of the proposed algorithm over state-of-the-art methods. For example, our method can reduce 57.8% of the parameters and 60.2% of the FLOPs of ResNet-101 with only 0.01% top-1 accuracy loss on ImageNet. The code is available at https://github.com/huawei-noah/Pruning/tree/master/SCOP_NeurIPS2020. A parameter-count sketch for checking this figure follows the table.
Researcher Affiliation | Collaboration | (1) Key Lab of Machine Perception (MOE), Dept. of Machine Intelligence, Peking University; (2) Noah's Ark Lab, Huawei Technologies; (3) School of Computer Science, Faculty of Engineering, University of Sydney. Emails: yhtang@pku.edu.cn, {yunhe.wang, yixing.xu, xuchunjing}@huawei.com, {dacheng.tao, c.xu}@sydney.edu.au, xuchao@cis.pku.edu.cn.
Pseudocode | No | The paper describes its method verbally and mathematically, but it does not include any explicitly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | The code is available at https://github.com/huawei-noah/Pruning/tree/master/SCOP_NeurIPS2020.
Open Datasets | Yes | The CIFAR-10 dataset contains 60K RGB images from 10 classes: 50K images for training and 10K for testing. ImageNet (ILSVRC-2012) is a large-scale dataset containing 1.28M training images and 50K validation images from 1000 classes.
Dataset Splits | Yes | CIFAR-10 uses the standard 50K/10K train/test split, and ImageNet uses its 1.28M/50K train/validation split. The single-view validation errors of the pruned networks are reported in Table 2. A dataset-loading sketch follows the table.
Hardware Specification | Yes | The experiments are conducted with PyTorch [28] and MindSpore on NVIDIA V100 GPUs. Latency is calculated by measuring the forward time on an NVIDIA V100 GPU with a batch size of 128. A timing sketch follows the table.
Software Dependencies | No | The paper mentions PyTorch and MindSpore but does not specify their version numbers, which are necessary for reproducible software dependencies.
Experiment Setup | Yes | On CIFAR-10, the learning rate, batch size, and number of epochs are set to 0.001, 128, and 50; on ImageNet they are 0.004, 1024, and 20. The initial value of the scaling factors is set to 0.5. The pruned network is then fine-tuned for 400 epochs on CIFAR-10 and 120 epochs on ImageNet, with initial learning rates of 0.04 and 0.2, respectively. A training-configuration sketch follows the table.
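
The quoted 57.8% parameter reduction for ResNet-101 can in principle be checked by counting parameters in the dense and pruned models. Below is a minimal sketch of such a check, assuming a pruned checkpoint saved as a full model object; the file name scop_resnet101_pruned.pth is hypothetical, not a file shipped with the paper's repository.

```python
import torch
import torchvision.models as models

def count_params(model: torch.nn.Module) -> int:
    """Total number of trainable parameters in a model."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

dense = models.resnet101()  # unpruned baseline architecture
dense_params = count_params(dense)

# Hypothetical: a pruned checkpoint saved as a complete model object.
pruned = torch.load("scop_resnet101_pruned.pth", map_location="cpu")
pruned_params = count_params(pruned)

reduction = 1.0 - pruned_params / dense_params
print(f"Parameter reduction: {reduction:.1%}")  # paper reports 57.8%
```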
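
The quoted dataset splits map directly onto torchvision's standard loaders. A sketch, assuming torchvision is installed; note that ImageNet cannot be downloaded automatically and must already exist on disk.

```python
import torchvision
import torchvision.transforms as T

transform = T.Compose([T.ToTensor()])

# CIFAR-10: 50,000 training and 10,000 test images, as quoted above.
train_set = torchvision.datasets.CIFAR10(root="./data", train=True,
                                         download=True, transform=transform)
test_set = torchvision.datasets.CIFAR10(root="./data", train=False,
                                        download=True, transform=transform)
print(len(train_set), len(test_set))  # 50000 10000

# ImageNet (ILSVRC-2012): 1.28M training / 50K validation images.
# torchvision only reads a local copy; the path below is a placeholder.
# val_set = torchvision.datasets.ImageNet(root="/path/to/imagenet",
#                                         split="val", transform=transform)
```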
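
The latency protocol quoted above (forward time on a V100 with batch size 128) can be reproduced with a sketch like the following. CUDA synchronization is required for correct GPU timing; the warm-up and repeat counts are assumptions, not values taken from the paper.

```python
import time
import torch
import torchvision.models as models

model = models.resnet101().cuda().eval()
x = torch.randn(128, 3, 224, 224, device="cuda")  # batch size 128, as quoted

with torch.no_grad():
    for _ in range(10):          # warm-up iterations (assumed count)
        model(x)
    torch.cuda.synchronize()     # make sure warm-up kernels have finished

    start = time.perf_counter()
    for _ in range(50):          # timed repeats (assumed count)
        model(x)
    torch.cuda.synchronize()     # wait for all kernels before stopping the clock
    elapsed = time.perf_counter() - start

print(f"Mean forward time: {elapsed / 50 * 1000:.2f} ms per batch of 128")
```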
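
The quoted hyperparameters translate into a straightforward PyTorch training configuration. A sketch for the CIFAR-10 setting; the optimizer family (SGD), momentum, weight decay, and schedule shape are assumptions, since the quote does not specify them.

```python
import torch

# CIFAR-10 values quoted above; ImageNet uses lr=0.004, batch 1024, 20 epochs.
prune_cfg = dict(lr=0.001, batch_size=128, epochs=50, init_scale=0.5)
finetune_cfg = dict(lr=0.04, epochs=400)  # fine-tuning phase on CIFAR-10

model = torch.nn.Linear(10, 10)  # placeholder; the paper prunes ResNet variants

# Optimizer family, momentum, and weight decay are assumed, not from the quote.
prune_opt = torch.optim.SGD(model.parameters(), lr=prune_cfg["lr"],
                            momentum=0.9, weight_decay=5e-4)
finetune_opt = torch.optim.SGD(model.parameters(), lr=finetune_cfg["lr"],
                               momentum=0.9, weight_decay=5e-4)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(
    finetune_opt, T_max=finetune_cfg["epochs"])  # schedule shape assumed
```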