Capsule Network (CapsNet) achieves great improvements in recognizing pose and deformation through a novel encoding mode. However, it carries a large number of parameters, leading to heavy memory and computational costs. To solve this problem, we propose a sparse CapsNet with an explicit regularizer in this paper. To our knowledge, this is the first work that utilizes sparse optimization to compress CapsNet. Specifically, to reduce unnecessary weight parameters, we first introduce a component-wise absolute-value regularizer into the objective function of CapsNet, based on a zero-mean Laplacian prior. Then, to reduce the computational cost and speed up CapsNet, the weight parameters are further grouped by 2D filters and sparsified by ℓ1-norm regularization. To train our model efficiently, a new stochastic proximal gradient algorithm, which has analytical solutions at each iteration, is presented. Extensive numerical experiments on four commonly used datasets validate the effectiveness and efficiency of the proposed method. (c) 2021 Elsevier Ltd. All rights reserved.
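The two regularizers described above both admit closed-form proximal operators, which is what makes each iteration of a stochastic proximal gradient method analytical. The following is a minimal NumPy sketch, not the paper's actual implementation: `prox_l1` is the standard soft-thresholding operator for a component-wise absolute-value (ℓ1) penalty, `prox_filter_group` shrinks each 2D filter as a group (an ℓ2,1-type proximal step, one plausible reading of "grouped by 2D filters and sparsified by ℓ1-norm regularization"), and all function names, shapes, and the composition into a single update are illustrative assumptions.

```python
import numpy as np

def prox_l1(w, tau):
    """Closed-form prox of tau * ||w||_1: elementwise soft-thresholding."""
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

def prox_filter_group(W, tau):
    """Group shrinkage over 2D filters (assumed shape: (num_filters, k, k)).

    Each filter's Frobenius norm is soft-thresholded, so whole filters
    are driven exactly to zero -- the group-sparse analogue of prox_l1.
    """
    norms = np.sqrt((W ** 2).sum(axis=(1, 2), keepdims=True))
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return W * scale

def stochastic_prox_step(W, grad, lr, lam):
    """One stochastic proximal gradient iteration (illustrative):
    a gradient step on the data-fit term, then the analytical prox
    of the filter-group penalty with step-size-scaled strength lr * lam.
    """
    return prox_filter_group(W - lr * grad, lr * lam)
```

Because both proximal maps are simple elementwise or per-filter rescalings, each iteration costs no more than the gradient step itself, which is consistent with the abstract's claim of analytical per-iteration solutions.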