Neural Networks, 2022, Vol. 145. DOI: 10.1016/j.neunet.2021.10.018

Sparsity-control ternary weight networks

Deng X.¹, Zhang Z.¹

Author information

  • 1. State University of New York at Binghamton

Abstract

© 2021 Elsevier Ltd. Deep neural networks (DNNs) have been widely and successfully applied to various applications, but they require large amounts of memory and computational power. This severely restricts their deployment on resource-limited devices. To address this issue, many efforts have been made on training low-bit weight DNNs. In this paper, we focus on training ternary weight {−1, 0, +1} networks, which avoid multiplications and dramatically reduce the memory and computation requirements. A ternary weight network can be considered a sparser version of its binary weight counterpart, obtained by replacing some −1s or +1s in the binary weights with 0s, thus leading to more efficient inference but more memory cost. However, the existing approaches to train ternary weight networks cannot control the sparsity (i.e., the percentage of 0s) of the ternary weights, which undermines the advantage of ternary weights. In this paper, we propose, to the best of our knowledge, the first sparsity-control approach (SCA) to train ternary weight networks, achieved simply by a weight discretization regularizer (WDR). SCA differs from all existing regularizer-based approaches in that it can control the sparsity of the ternary weights through a controller α and does not rely on gradient estimators. We theoretically and empirically show that the sparsity of the trained ternary weights is positively related to α. SCA is extremely simple, easy to implement, consistently and significantly outperforms the state-of-the-art approaches over several benchmark datasets, and even matches the performance of the full-precision weight counterparts.
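The abstract describes controlling the fraction of zeros in ternary weights through a parameter α. The paper's actual WDR regularizer is not reproduced here, but the underlying idea that a single scalar can monotonically control ternary sparsity can be illustrated with the classical threshold-based ternarization scheme, where a threshold Δ plays a role analogous to α (a hypothetical stand-in, not SCA itself): weights with magnitude below Δ are zeroed, the rest are mapped to ±1, so a larger Δ yields a sparser ternary tensor.

```python
import numpy as np

def ternarize(w, delta):
    # Threshold-based ternarization: entries with |w| <= delta become 0,
    # the rest become sign(w) in {-1, +1}. This is a generic scheme used
    # here only to illustrate sparsity control; it is not the paper's WDR.
    return np.where(np.abs(w) > delta, np.sign(w), 0.0)

def sparsity(t):
    # Fraction of zero entries in the ternary tensor.
    return float(np.mean(t == 0))

rng = np.random.default_rng(0)
w = rng.normal(size=10_000)          # stand-in for a trained weight tensor

low = sparsity(ternarize(w, 0.1))    # small threshold -> few zeros
high = sparsity(ternarize(w, 1.0))   # large threshold -> many zeros
assert low < high                    # sparsity grows monotonically with the threshold
```

For standard-normal weights, the sparsity equals P(|w| ≤ Δ), so it increases smoothly from 0 toward 1 as Δ grows, mirroring the positive relation between sparsity and α stated in the abstract.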

Key words

Image classification, Model compression, Sparsity control, Ternary weight networks


Publication year

2022
Neural Networks

Indexed in: EI, SCI
ISSN: 0893-6080
Cited by: 2
References: 60