To solve stochastic composite optimization problems in machine learning, we propose a new accelerated proximal variance-reduced gradient algorithm, called Acc-Prox-SVRG-Polyak, which combines the Acc-Prox-SVRG algorithm with the Polyak step size method. Compared with existing algorithms, the new algorithm can fully exploit the advantages of acceleration techniques and the Polyak step size to improve accuracy. The convergence of the algorithm is established under standard assumptions, and its complexity is analyzed. Finally, numerical experiments on standard data sets verify the effectiveness of the new algorithm.
Key words
Polyak step size/variance reduction/machine learning/stochastic gradient
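To illustrate the idea described in the abstract, the following is a minimal sketch (not the authors' exact Acc-Prox-SVRG-Polyak algorithm) of a proximal SVRG loop in which the fixed step size is replaced by a Polyak-type step size. It solves a lasso-style composite problem min_x (1/2n)||Ax - b||^2 + lam*||x||_1; all function names, the step-size cap, and the choice f* = 0 for the smooth part are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_svrg_polyak(A, b, lam, n_epochs=30, f_star=0.0, seed=0):
    """Sketch: Prox-SVRG with a Polyak-type step size (illustrative, not
    the paper's exact method) for (1/2n)||Ax-b||^2 + lam*||x||_1."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    # Cap steps by the inverse component smoothness constant for stability
    # (an assumption made for this sketch).
    L = np.max(np.sum(A ** 2, axis=1))
    for _ in range(n_epochs):
        x_tilde = x.copy()
        mu = A.T @ (A @ x_tilde - b) / n          # full-gradient snapshot
        for _ in range(n):
            i = rng.integers(n)
            ai = A[i]
            # Variance-reduced stochastic gradient (SVRG estimator).
            g = ai * (ai @ x - b[i]) - ai * (ai @ x_tilde - b[i]) + mu
            # Polyak-type step size from the smooth part's value gap;
            # f_star = 0 is a valid lower bound for a nonnegative loss.
            f_x = 0.5 * np.mean((A @ x - b) ** 2)
            eta = min((f_x - f_star) / (g @ g + 1e-12), 1.0 / L)
            # Proximal (soft-thresholding) step handles the L1 term.
            x = soft_threshold(x - eta * g, eta * lam)
    return x
```

The Polyak step size adapts automatically as the iterates approach the optimum (the value gap and the gradient norm shrink together), which is the property the abstract credits for the improved accuracy over a fixed step size.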