Bailin Li, Bowen Wu, Jiang Su, Guangrun Wang, Liang Lin. EagleEye: Fast Sub-net Evaluation for Efficient Neural Network Pruning. ECCV 2020 (Oral).
Identifying the computationally redundant parts of a trained Deep Neural Network (DNN) is the key question that pruning algorithms target. Many algorithms try to predict the performance of pruned sub-nets by introducing various evaluation methods, but these are either inaccurate or too complicated for general application. In this work, we present a pruning method called EagleEye, in which a simple yet efficient evaluation component based on adaptive batch normalization reveals a strong correlation between different pruned DNN structures and their final converged accuracy. This strong correlation allows us to quickly identify the pruned candidates with the highest potential accuracy without actually fine-tuning them. The module is also general enough to plug into and improve some existing pruning algorithms. EagleEye achieves better pruning performance than all of the pruning algorithms studied in our experiments. Concretely, when pruning MobileNet V1 and ResNet-50, EagleEye outperforms all compared methods by up to 3.8%. Even in the more challenging task of pruning the compact MobileNet V1, EagleEye achieves the highest accuracy of 70.9% with 50% of overall operations (FLOPs) pruned. All accuracy results are Top-1 ImageNet classification accuracy. Source code and models are available to the open-source community.
- We point out why the so-called vanilla evaluation step (explained in Section 3.1), widely found in existing pruning methods, leads to poor pruning results. To quantitatively demonstrate the issue, we are the first to introduce correlation analysis to the domain of pruning algorithms.
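The correlation analysis mentioned above can be illustrated with a minimal NumPy sketch. The accuracy numbers here are invented purely for illustration (they are not from the paper): a Pearson correlation near zero between evaluation accuracy and fine-tuned accuracy would mean that evaluation step is a poor predictor of which sub-net will perform best after fine-tuning.

```python
import numpy as np

# Hypothetical accuracies of five pruned sub-nets, for illustration only:
# accuracy right after pruning (vanilla evaluation) vs. after full fine-tuning.
vanilla_eval_acc = np.array([0.12, 0.31, 0.05, 0.22, 0.18])
fine_tuned_acc = np.array([0.71, 0.69, 0.70, 0.72, 0.68])

# Pearson correlation coefficient between the two accuracy vectors.
# A value near 0 means vanilla evaluation tells us little about which
# sub-net will actually fine-tune to the best accuracy.
rho = np.corrcoef(vanilla_eval_acc, fine_tuned_acc)[0, 1]
print(f"Pearson correlation: {rho:.3f}")
```

The same computation, applied to many sampled sub-nets, is how one would quantify whether an evaluation method ranks candidates reliably.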
- We adopt the technique of adaptive batch normalization for pruning in this work to address the issue in the vanilla evaluation step. It is one of the modules in our proposed pruning algorithm, EagleEye. Our algorithm can effectively estimate the converged accuracy of any pruned model with only a few iterations of inference. It is also general enough to plug into and improve some existing methods.
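The core of the adaptive-BN idea is that, after pruning, a network's batch normalization running statistics no longer match the pruned activations, so they are re-estimated from a few forward passes while all learned weights stay frozen. A minimal NumPy sketch of that statistic re-estimation (the function name and momentum-style update are illustrative, not the official implementation):

```python
import numpy as np

def adapt_bn_stats(batches, momentum=0.1):
    """Re-estimate BN running mean/variance from a few activation batches.

    `batches` is an iterable of arrays with shape (batch_size, num_features).
    Learned affine parameters (gamma/beta) are untouched; only the running
    statistics are updated, mimicking a few forward passes in training mode.
    """
    running_mean, running_var = None, None
    for x in batches:
        mu = x.mean(axis=0)
        var = x.var(axis=0)
        if running_mean is None:
            # First batch initializes the statistics.
            running_mean, running_var = mu, var
        else:
            # Exponential moving average, as in standard BN layers.
            running_mean = (1 - momentum) * running_mean + momentum * mu
            running_var = (1 - momentum) * running_var + momentum * var
    return running_mean, running_var

# Simulated post-pruning activations with true mean 2.0 and variance 9.0.
rng = np.random.default_rng(0)
batches = [rng.normal(loc=2.0, scale=3.0, size=(64, 8)) for _ in range(10)]
mean, var = adapt_bn_stats(batches)
# After a few batches, the adapted statistics approach the true values.
```

In a PyTorch setting the same effect is obtained by putting BN layers in training mode and running a handful of inference batches, which is what makes the evaluation cost only a few iterations.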
- Our experiments show that although EagleEye is simple, it achieves state-of-the-art pruning performance compared with many more complex approaches. In the ResNet-50 experiments, EagleEye delivers 1.3% to 3.8% higher accuracy than the compared algorithms. Even in the challenging task of pruning the compact MobileNet V1, EagleEye achieves the highest accuracy of 70.9% with 50% of overall operations (FLOPs) pruned. The results here are Top-1 ImageNet classification accuracy.
We presented the EagleEye pruning algorithm, which includes a fast and accurate evaluation process based on adaptive batch normalization. Our experiments demonstrate the efficiency and effectiveness of the proposed method, which delivers higher accuracy than the studied methods in pruning experiments on the ImageNet dataset. An interesting direction for future work is to further explore the generality of the adaptive-BN-based module by integrating it into other existing methods and observing the potential improvement. Another experiment worth trying is to replace the random generation of pruning strategies with more advanced methods such as evolutionary algorithms.
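The random generation of pruning strategies mentioned above can be sketched as rejection sampling: draw layer-wise pruning ratios at random and keep only those whose remaining FLOPs land near the target budget. Everything below (function name, tolerance, budgets) is an illustrative assumption, not the official code:

```python
import random

def random_pruning_strategies(layer_flops, flops_target, num_candidates=100,
                              max_ratio=0.7, tolerance=0.02, seed=0):
    """Sample layer-wise pruning ratios satisfying a FLOPs budget.

    `layer_flops` lists the FLOPs of each prunable layer; `flops_target`
    is the desired fraction of total FLOPs to keep (e.g. 0.5 for a 50%
    pruned model). Candidates outside the budget tolerance are rejected.
    """
    rng = random.Random(seed)
    total = sum(layer_flops)
    candidates = []
    while len(candidates) < num_candidates:
        # One pruning ratio per layer, drawn uniformly at random.
        ratios = [rng.uniform(0.0, max_ratio) for _ in layer_flops]
        remaining = sum(f * (1 - r) for f, r in zip(layer_flops, ratios))
        if abs(remaining / total - flops_target) < tolerance:
            candidates.append(ratios)
    return candidates

# Toy 4-layer network: keep roughly 50% of total FLOPs.
strategies = random_pruning_strategies([100, 200, 400, 300], flops_target=0.5)
```

Each surviving candidate would then be scored with the adaptive-BN evaluation, and only the top-ranked one fine-tuned; swapping this sampler for an evolutionary search is the extension suggested above.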