An accelerated proximal stochastic gradient method with variance reduction based on Polyak step size
To solve stochastic composite optimization problems in machine learning, we propose a new accelerated proximal variance-reduced gradient algorithm, called Acc-Prox-SVRG-Polyak, which combines the Acc-Prox-SVRG algorithm with the Polyak step size. Compared with existing algorithms, the new algorithm fully exploits the advantages of the acceleration technique and the Polyak step size to improve accuracy. Convergence of the algorithm is established under standard assumptions, and its complexity is analyzed. Finally, numerical experiments on standard data sets verify the effectiveness of the new algorithm.
Keywords: Polyak step size; variance reduction; machine learning; stochastic gradient
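As background for the abstract, the two ingredients being combined have well-known standard forms; the following is a minimal sketch, with the understanding that the exact update used by Acc-Prox-SVRG-Polyak is specified in the body of the paper. For the composite problem $\min_x F(x) = f(x) + h(x)$ with $f(x) = \frac{1}{n}\sum_{i=1}^n f_i(x)$, the SVRG variance-reduced gradient at iterate $x_k$, with snapshot point $\tilde{x}$ and sampled index $i_k$, is
$$v_k = \nabla f_{i_k}(x_k) - \nabla f_{i_k}(\tilde{x}) + \nabla f(\tilde{x}),$$
the classical Polyak step size (assuming the optimal value $f^*$ is known) is
$$\eta_k = \frac{f(x_k) - f^*}{\|\nabla f(x_k)\|^2},$$
and the nonsmooth term $h$ is handled by a proximal step of the form $x_{k+1} = \operatorname{prox}_{\eta_k h}\!\left(x_k - \eta_k v_k\right)$, to which the acceleration (momentum) step is added.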