Neural Networks, 2022, Vol. 146. DOI: 10.1016/j.neunet.2021.11.029

LOss-Based SensiTivity rEgulaRization: Towards deep sparse neural networks

Tartaglione E., Bragagnolo A., Fiandrotti A., Grangetto M.


Author Information

  • 1. Università degli Studi di Torino

Abstract

© 2021 Elsevier Ltd. LOBSTER (LOss-Based SensiTivity rEgulaRization) is a method for training neural networks having a sparse topology. Let the sensitivity of a network parameter be the variation of the loss function with respect to a variation of that parameter. Parameters with low sensitivity, i.e. those having little impact on the loss when perturbed, are shrunk and then pruned to sparsify the network. The method allows a network to be trained sparse from scratch, i.e. without preliminary learning or rewinding. Experiments on multiple architectures and datasets show competitive compression ratios with minimal computational overhead.
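The idea sketched in the abstract — approximate each parameter's sensitivity from the loss gradient, shrink the insensitive parameters, and prune those driven near zero — can be illustrated with a minimal NumPy sketch. This is a simplified illustration, not the authors' implementation; the function name `lobster_update` and the hyperparameters `lr`, `lam`, and `threshold` are assumptions for this example.

```python
import numpy as np

def lobster_update(w, grad, lr=0.1, lam=0.01, threshold=1e-3):
    """One hypothetical update step in the spirit of LOBSTER.

    Sensitivity is approximated by the loss-gradient magnitude |dL/dw|:
    a small gradient means perturbing w barely changes the loss.  Such
    parameters receive an extra shrinking (weight-decay-like) term, and
    weights whose magnitude falls below `threshold` are pruned to zero.
    """
    sensitivity = np.abs(grad)
    # Insensitivity in [0, 1]: close to 1 for parameters the loss ignores.
    insensitivity = 1.0 - np.minimum(sensitivity, 1.0)
    # Gradient step plus shrinkage proportional to insensitivity.
    w = w - lr * grad - lam * insensitivity * w
    # Prune near-zero weights to sparsify the network.
    w = np.where(np.abs(w) < threshold, 0.0, w)
    return w
```

For example, a parameter with zero gradient and tiny magnitude (here `0.0005`) is shrunk and then pruned to exactly zero, while parameters that matter to the loss are left essentially untouched by the regularization term.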

Key words

Deep learning; Pruning; Regularization; Sparsity


Publication year: 2022
Journal: Neural Networks
ISSN: 0893-6080
Indexed in: EI, SCI
Cited by: 7
References: 34