Sparse CapsNet with explicit regularizer

Capsule Network (CapsNet) achieves great improvements in recognizing pose and deformation through a novel encoding mode. However, it carries a large number of parameters, leading to heavy memory and computational costs. To solve this problem, we propose a sparse CapsNet with an explicit regularizer in this paper. To our knowledge, this is the first work that utilizes sparse optimization to compress CapsNet. Specifically, to reduce unnecessary weight parameters, we first introduce a component-wise absolute value regularizer into the objective function of CapsNet, based on a zero-mean Laplacian prior. Then, to reduce the computational cost and speed up CapsNet, the weight parameters are further grouped by 2D filters and sparsified by 1-norm regularization. To train our model efficiently, a new stochastic proximal gradient algorithm, which has analytical solutions at each iteration, is presented. Extensive numerical experiments on four commonly used datasets validate the effectiveness and efficiency of the proposed method. (c) 2021 Elsevier Ltd. All rights reserved.
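The abstract mentions a stochastic proximal gradient algorithm with analytical (closed-form) solutions at each iteration. For the two regularizers described, the element-wise absolute value penalty and a filter-wise group penalty, the standard proximal operators are soft-thresholding and block soft-thresholding. The sketch below illustrates these generic operators with NumPy; the function names and the single-step update are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def prox_l1(w, t):
    # Soft-thresholding: the closed-form proximal operator of t * ||w||_1.
    # Shrinks every component toward zero; components with |w_i| <= t become exactly zero.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def prox_group(w, t):
    # Block soft-thresholding for one parameter group (e.g., the weights of one 2D filter):
    # the closed-form proximal operator of t * ||w||_2. The whole group is zeroed
    # when its norm falls below the threshold, which is what prunes entire filters.
    norm = np.linalg.norm(w)
    if norm <= t:
        return np.zeros_like(w)
    return (1.0 - t / norm) * w

def stochastic_prox_step(w, stochastic_grad, lr, lam):
    # One generic stochastic proximal gradient iteration:
    # a gradient step on the smooth (data) loss, followed by the analytical
    # prox of the nonsmooth regularizer, here the l1 penalty.
    return prox_l1(w - lr * stochastic_grad, lr * lam)
```

Because both proximal operators have closed forms, each training iteration costs little more than a plain SGD step, which is why such algorithms scale to network-sized parameter tensors.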

Keywords: Capsule network; Model compression; Sparse regularization; Proximal gradient descent; Neural networks; Dropout; Model

Shi, Ruiyang; Niu, Lingfeng; Zhou, Ruizhi


Univ Chinese Acad Sci

South China Normal Univ

2022

Pattern Recognition

Indexed in: EI, SCI
ISSN:0031-3203
Year, volume (issue): 2022, 124