TY - GEN
T1 - Improve Convolutional Neural Network Pruning by Maximizing Filter Variety
AU - Hubens, Nathan
AU - Mancas, Matei
AU - Gosselin, Bernard
AU - Preda, Marius
AU - Zaharia, Titus
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2022/1/1
Y1 - 2022/1/1
N2 - Neural network pruning is a widely used strategy for reducing model storage and computing requirements. It lowers the complexity of the network by introducing sparsity into the weights. Because taking advantage of sparse matrices is still challenging, pruning is often performed in a structured way, i.e. by removing entire convolution filters in the case of ConvNets, according to a chosen pruning criterion. Common pruning criteria, such as l1-norm or movement, usually do not consider the individual utility of filters, which may lead to: (1) the removal of filters exhibiting rare, and thus important and discriminative, behaviour, and (2) the retention of filters with redundant information. In this paper, we present a technique that solves these two issues and can be appended to any pruning criterion. It ensures that the selection criterion focuses on redundant filters while retaining rare ones, thus maximizing the variety of the remaining filters. Experimental results, carried out on several datasets (CIFAR-10, CIFAR-100 and CALTECH-101) and architectures (VGG-16 and ResNet-18), demonstrate that similar sparsity levels can be reached while maintaining higher performance when our filter selection technique is appended to a pruning criterion. Moreover, we assess the quality of the resulting sparse subnetworks by applying the Lottery Ticket Hypothesis and find that adding our method leads to better-performing tickets in most cases.
AB - Neural network pruning is a widely used strategy for reducing model storage and computing requirements. It lowers the complexity of the network by introducing sparsity into the weights. Because taking advantage of sparse matrices is still challenging, pruning is often performed in a structured way, i.e. by removing entire convolution filters in the case of ConvNets, according to a chosen pruning criterion. Common pruning criteria, such as l1-norm or movement, usually do not consider the individual utility of filters, which may lead to: (1) the removal of filters exhibiting rare, and thus important and discriminative, behaviour, and (2) the retention of filters with redundant information. In this paper, we present a technique that solves these two issues and can be appended to any pruning criterion. It ensures that the selection criterion focuses on redundant filters while retaining rare ones, thus maximizing the variety of the remaining filters. Experimental results, carried out on several datasets (CIFAR-10, CIFAR-100 and CALTECH-101) and architectures (VGG-16 and ResNet-18), demonstrate that similar sparsity levels can be reached while maintaining higher performance when our filter selection technique is appended to a pruning criterion. Moreover, we assess the quality of the resulting sparse subnetworks by applying the Lottery Ticket Hypothesis and find that adding our method leads to better-performing tickets in most cases.
KW - Neural network interpretation
KW - Neural network pruning
U2 - 10.1007/978-3-031-06427-2_32
DO - 10.1007/978-3-031-06427-2_32
M3 - Conference contribution
AN - SCOPUS:85130922863
SN - 9783031064265
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 379
EP - 390
BT - Image Analysis and Processing – ICIAP 2022 – 21st International Conference, 2022, Proceedings
A2 - Sclaroff, Stan
A2 - Distante, Cosimo
A2 - Leo, Marco
A2 - Farinella, Giovanni M.
A2 - Tombari, Federico
PB - Springer Science and Business Media Deutschland GmbH
T2 - 21st International Conference on Image Analysis and Processing, ICIAP 2022
Y2 - 23 May 2022 through 27 May 2022
ER -