Anderson Acceleration of Gradient Methods with Energy for Optimization Problems

Anderson acceleration (AA) is an extrapolation technique designed to speed up fixed-point iterations. For optimization problems, we propose a novel algorithm that combines AA with the energy-adaptive gradient method (AEGD) [arXiv:2010.05109]. Although AEGD is not a fixed-point iteration, the feasibility of our algorithm is ensured by the convergence theory for AEGD. We provide rigorous convergence rates of AA for gradient descent (GD), quantified by an acceleration factor representing the gain at each implementation of AA-GD. Our experimental results show that the proposed AA-AEGD algorithm requires little tuning of hyperparameters and exhibits superior fast convergence.
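To illustrate the kind of speedup the abstract describes, the sketch below applies Type-II Anderson acceleration with memory m to the plain GD fixed-point map g(x) = x − η∇f(x) on an ill-conditioned quadratic. This is a minimal generic AA-GD sketch; the test problem, step size, and memory size are illustrative assumptions, not the paper's AA-AEGD method or experimental setup.

```python
import numpy as np

A = np.diag([1.0, 10.0, 100.0])  # ill-conditioned quadratic f(x) = 0.5 x^T A x

def g(x, eta=0.01):
    """One gradient-descent step, viewed as a fixed-point map."""
    return x - eta * (A @ x)

def aa_gd(x0, m=5, iters=30):
    """Type-II Anderson acceleration of the GD map g with memory m."""
    X, G = [x0], [g(x0)]   # past iterates and their images under g
    x = G[0]
    for k in range(1, iters):
        X.append(x)
        G.append(g(x))
        mk = min(m, k)
        # Residuals r_i = g(x_i) - x_i over the memory window
        R = np.column_stack([G[i] - X[i] for i in range(k - mk, k + 1)])
        dR = R[:, 1:] - R[:, :-1]
        # Least-squares mixing coefficients: min_gamma || r_k - dR @ gamma ||
        gamma, *_ = np.linalg.lstsq(dR, R[:, -1], rcond=None)
        Gm = np.column_stack(G[k - mk:k + 1])
        dG = Gm[:, 1:] - Gm[:, :-1]
        # Extrapolated update
        x = G[-1] - dG @ gamma
    return x
```

On this quadratic, 30 plain GD steps with η = 0.01 still leave an O(1) error in the slow (eigenvalue 1) direction, while the AA-GD iterate is driven near the minimizer, since for linear maps AA with sufficient memory behaves like a Krylov method.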

Anderson acceleration (AA), gradient descent (GD), energy stability

Hailiang Liu, Jia-Hao He, Xuping Tian

Department of Mathematics, Iowa State University, Ames, IA, USA

Department of Agricultural and Biosystems Engineering, Iowa State University, Ames, IA, USA

National Science Foundation under Grant 1812666

2024

Communications on Applied Mathematics and Computation (应用数学与计算数学学报)
Shanghai University (上海大学)

Impact factor: 0.165
ISSN:1006-6330
Year, Volume (Issue): 2024, 6(2)