Anderson Acceleration of Gradient Methods with Energy for Optimization Problems
Full-text links: NETL · NSTL · 万方数据 (Wanfang Data)
Anderson acceleration (AA) is an extrapolation technique designed to speed up fixed-point iterations. For optimization problems, we propose a novel algorithm that combines AA with the energy adaptive gradient method (AEGD) [arXiv:2010.05109]. Although AEGD is not a fixed-point iteration, the feasibility of our algorithm is ensured by the convergence theory for AEGD. We provide rigorous convergence rates of AA for gradient descent (GD), quantified by an acceleration factor given by the gain at each implementation of AA-GD. Our experimental results show that the proposed AA-AEGD algorithm requires little tuning of hyperparameters and exhibits superior fast convergence.
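As a rough illustration of the extrapolation idea described in the abstract, the sketch below applies Type-II Anderson acceleration to the gradient-descent fixed-point map g(x) = x − η∇f(x). This corresponds to the generic AA-GD setting of the convergence analysis, not the authors' AA-AEGD scheme; the function name anderson_gd, the window size m, the step size, and the quadratic test problem are illustrative assumptions.

```python
import numpy as np

def anderson_gd(grad, x0, lr=0.1, m=5, tol=1e-8, max_iter=500):
    """Type-II Anderson acceleration of the gradient-descent map
    g(x) = x - lr * grad(x), with a history window of size m.
    A minimal sketch, not the paper's AA-AEGD algorithm."""
    g = lambda z: z - lr * grad(z)            # GD written as a fixed-point map
    x = np.asarray(x0, dtype=float)
    X, G = [x], [g(x)]                        # histories of iterates and g-values
    for _ in range(max_iter):
        mk = len(X)                           # current window size (<= m)
        # Residuals f_i = g(x_i) - x_i, columns ordered oldest -> newest
        F = np.stack([Gi - Xi for Gi, Xi in zip(G, X)], axis=1)
        if np.linalg.norm(F[:, -1]) < tol:    # latest residual small: converged
            break
        if mk == 1:
            x_new = G[-1]                     # no history yet: plain GD step
        else:
            # Solve min ||f_k - dF @ gamma|| (unconstrained form of the
            # sum-to-one least-squares problem), then extrapolate.
            dF = np.diff(F, axis=1)
            gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
            dG = np.diff(np.stack(G, axis=1), axis=1)
            x_new = G[-1] - dG @ gamma        # Anderson-extrapolated iterate
        X.append(x_new); G.append(g(x_new))
        X, G = X[-m:], G[-m:]                 # truncate to the last m entries
    return X[-1]

# Illustrative use on an ill-conditioned quadratic f(x) = 0.5 x^T A x
A = np.diag([1.0, 10.0, 100.0])
x_min = anderson_gd(lambda x: A @ x, x0=np.ones(3), lr=0.015)
print(np.linalg.norm(x_min))                  # close to 0, the minimizer
```

The sum-to-one constraint on the mixing weights is handled in the standard way: solving the unconstrained least-squares problem in the residual differences and recovering the extrapolated iterate from G[-1] - dG @ gamma.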
Keywords: Anderson acceleration (AA); Gradient descent (GD); Energy stability
Hailiang Liu, Jia-Hao He, Xuping Tian
Department of Mathematics,Iowa State University,Ames,IA,USA
Department of Agricultural and Biosystems Engineering,Iowa State University,Ames,IA,USA