Linear Convergence of a Class of Non-convex Bregman Gradient Algorithms
The gradient descent algorithm is an important method for solving unconstrained optimization problems, and the smoothness assumption plays a crucial role in its analysis. The Bregman gradient descent algorithm is an extension of gradient descent; it arises naturally when the classical smoothness assumption is relaxed to relative smoothness. This paper studies the linear convergence of the Bregman gradient descent algorithm for problems that are relatively strongly quasar-convex and relatively smooth. It is proved that when the objective function is relatively strongly quasar-convex and relatively smooth, the sequence of function values produced by the Bregman gradient descent algorithm converges at a linear rate. The convergence of the iterate sequence is also established.
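For reference, the Bregman gradient descent update can be stated schematically as follows; the kernel function $h$ and step size $\lambda_k$ are generic symbols used for illustration and are not quantities fixed by this abstract. With the Bregman distance
\[
  D_h(x, y) \;=\; h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle ,
\]
the iteration reads
\[
  x_{k+1} \;=\; \operatorname*{arg\,min}_{x} \;\Big\{ \langle \nabla f(x_k),\, x - x_k \rangle + \tfrac{1}{\lambda_k}\, D_h(x, x_k) \Big\} .
\]
When $h(x) = \tfrac{1}{2}\|x\|^2$, so that $D_h(x,y) = \tfrac{1}{2}\|x-y\|^2$, the scheme reduces to classical gradient descent, which illustrates why it can be viewed as the natural analogue of gradient descent under relative smoothness.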