Feature Statistics Guided Efficient Filter Pruning
Authors: Hang Li, Chen Ma, Wei Xu, Xue Liu
IJCAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We conduct extensive empirical experiments with various CNN architectures on publicly available datasets. The experimental results demonstrate that our model obtains up to 91.6% parameter decrease and 83.7% FLOPs reduction with almost no accuracy loss. |
| Researcher Affiliation | Academia | 1School of Computer Science, McGill University 2Institute for Interdisciplinary Information Sciences, Tsinghua University |
| Pseudocode | Yes | Algorithm 1 Our proposed filter pruning scheme; Algorithm 2 Similarity-aware feature map selection (SFS) |
| Open Source Code | No | The paper does not contain any explicit statement about releasing source code or provide a link to a code repository. |
| Open Datasets | Yes | CIFAR10 and CIFAR100 [Krizhevsky et al., 2009] are two widely used datasets with 32 × 32 colour natural images. They both contain 50,000 training images and 10,000 test images with 10 and 100 classes respectively. ... ILSVRC-2012 is a large-scale dataset with 1.2 million training images and 50,000 validation images of 1000 classes. |
| Dataset Splits | Yes | For ILSVRC-2012, we use the pre-trained ResNet-50 released by PyTorch. We train MobileNet for 60 epochs with a weight decay of 0.0015. ... ILSVRC-2012 is a large-scale dataset with 1.2 million training images and 50,000 validation images of 1000 classes. |
| Hardware Specification | No | The paper does not provide specific details about the hardware used, such as GPU/CPU models or memory specifications. |
| Software Dependencies | No | The paper mentions using 'PyTorch' but does not specify its version number or any other software dependencies with version details. |
| Experiment Setup | Yes | For CIFAR, we set the mini-batch size to 64, epochs to 160 with a weight decay of 0.0015 and Nesterov momentum [Sutskever et al., 2013] of 0.9. For ILSVRC-2012, we use the pre-trained ResNet-50 released by PyTorch. We train MobileNet for 60 epochs with a weight decay of 0.0015. |
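
The "Similarity-aware feature map selection (SFS)" routine named in the pseudocode row is not reproduced in this report, but its core idea can be illustrated. The sketch below is an assumption about how such a selection might work, not the authors' algorithm: it flattens each filter's feature map, scores each map by its highest cosine similarity to any other map, and flags the most redundant maps for pruning. The function names `cosine_similarity` and `select_redundant_maps` are hypothetical.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two flattened feature maps.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def select_redundant_maps(feature_maps, num_prune):
    # Score each map by its maximum similarity to any other map,
    # then flag the num_prune most redundant maps (hypothetical scheme,
    # not the paper's Algorithm 2).
    n = len(feature_maps)
    scores = []
    for i in range(n):
        best = max(
            cosine_similarity(feature_maps[i], feature_maps[j])
            for j in range(n) if j != i
        )
        scores.append((best, i))
    scores.sort(reverse=True)
    return sorted(i for _, i in scores[:num_prune])

# Example: map 2 is a scaled copy of map 0, so it is flagged as redundant.
maps = [[1.0, 0.0, 1.0], [0.0, 1.0, 0.0], [2.0, 0.0, 2.0]]
print(select_redundant_maps(maps, 1))  # → [2]
```

In practice such a selection would run on feature-map statistics gathered from a batch of inputs rather than on raw toy vectors, but the redundancy criterion is the same.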
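
The quoted CIFAR setup (mini-batch 64, 160 epochs, weight decay 0.0015, Nesterov momentum 0.9) maps directly onto a standard PyTorch optimizer configuration. This is a sketch under assumptions: `model` is a placeholder, and the learning rate is not stated in the quote, so the value here is an arbitrary assumption.

```python
import torch

model = torch.nn.Linear(10, 10)  # placeholder; stands in for the pruned CNN
optimizer = torch.optim.SGD(
    model.parameters(),
    lr=0.1,               # assumption: learning rate is not given in the quoted setup
    momentum=0.9,         # Nesterov momentum of 0.9, as quoted
    weight_decay=0.0015,  # weight decay of 0.0015, as quoted
    nesterov=True,
)
BATCH_SIZE = 64   # mini-batch size, as quoted
EPOCHS = 160      # CIFAR training epochs, as quoted
```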