An Accelerated Semi-Proximal ADMM with Applications to Multi-Block Sparse Optimization Problems
Original link
Springer Nature
Abstract As an extension of the alternating direction method of multipliers (ADMM), the semi-proximal ADMM (sPADMM) has been widely used in various fields due to its flexibility and robustness. In this paper, we first show that the two-block sPADMM algorithm achieves an $O(1/\sqrt{K})$ non-ergodic convergence rate. We then propose an accelerated sPADMM (AsPADMM) algorithm by introducing extrapolation techniques and increasing penalty parameters. The proposed AsPADMM algorithm is proven to converge globally to an optimal solution with a non-ergodic convergence rate of $O(1/K)$. Furthermore, AsPADMM can be extended and combined with the symmetric Gauss-Seidel decomposition to obtain an accelerated ADMM for multi-block problems. Finally, we apply the proposed AsPADMM to solve the multi-block subproblems arising in difference-of-convex algorithms for robust low-rank tensor completion problems and mixed sparse optimization problems. The numerical results suggest that the acceleration techniques bring a notable improvement in convergence speed.
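To make the ingredients of the abstract concrete, the sketch below shows a generic two-block ADMM for the lasso problem with an optional Nesterov-style extrapolation step (in the spirit of "fast ADMM" variants). This is only an illustration of the extrapolation idea under stated assumptions, not the paper's AsPADMM: the problem, the fixed penalty parameter `rho`, and the momentum rule are all choices made here for the example.

```python
import numpy as np

def admm_lasso(A, b, lam, rho=1.0, iters=300, accelerate=False):
    """Two-block ADMM for:  min_x 0.5*||Ax - b||^2 + lam*||z||_1  s.t.  x = z.

    With accelerate=True, the (z, u) pair is extrapolated with a
    Nesterov-type momentum coefficient; this illustrates the kind of
    extrapolation the abstract refers to, not the AsPADMM scheme itself.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                 # scaled dual variable
    z_hat, u_hat = z.copy(), u.copy()
    t = 1.0                         # momentum parameter
    AtA, Atb = A.T @ A, A.T @ b
    # Factor (A^T A + rho I) once; reused in every x-update.
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(iters):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z_hat - u_hat)
        rhs = Atb + rho * (z_hat - u_hat)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: soft-thresholding, the prox of (lam/rho)*||.||_1
        v = x + u_hat
        z_new = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update
        u_new = u_hat + x - z_new
        if accelerate:
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            z_hat = z_new + (t - 1.0) / t_new * (z_new - z)
            u_hat = u_new + (t - 1.0) / t_new * (u_new - u)
            t = t_new
        else:
            z_hat, u_hat = z_new, u_new
        z, u = z_new, u_new
    return x, z
```

For a well-conditioned instance (here the data term is strongly convex), both variants drive the primal residual `||x - z||` to zero; the accelerated variant typically does so in fewer iterations, mirroring the speed-up the abstract reports for AsPADMM.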