
An accelerated proximal stochastic gradient method with variance reduction based on Polyak step size

To solve stochastic composite optimization problems in large-scale machine learning, this paper combines the accelerated proximal stochastic variance-reduced gradient algorithm (Acc-Prox-SVRG) with the Polyak step size method and proposes a new accelerated proximal stochastic variance reduction algorithm, Acc-Prox-SVRG-Polyak. Compared with existing algorithms, the new algorithm makes full use of the advantages of the acceleration technique and the Polyak step size to improve accuracy. The convergence of the algorithm is established under the usual assumptions, and its complexity is analyzed. Finally, numerical experiments on standard data sets verify the effectiveness of the new algorithm.
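The abstract only outlines the method at a high level, so the following is a minimal illustrative sketch of the general idea: a proximal SVRG loop with a momentum (acceleration) step and a capped stochastic Polyak-type step size, applied to an l1-regularized least-squares problem. The function name acc_prox_svrg_polyak_sketch, the step-size cap eta_max, the momentum coefficient beta, the choice f_i* = 0 for the squared loss, and all other parameter values are assumptions made for illustration; this is not the paper's exact Acc-Prox-SVRG-Polyak algorithm.

import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def acc_prox_svrg_polyak_sketch(A, b, lam=1e-3, beta=0.9, eta_max=1.0,
                                n_outer=20, n_inner=None, seed=0):
    # Hypothetical sketch: proximal SVRG with a momentum step and a capped
    # stochastic Polyak-type step size, for min_x (1/2n)||Ax-b||^2 + lam*||x||_1.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    if n_inner is None:
        n_inner = n                       # common SVRG choice: inner loop length ~ n
    x_tilde = np.zeros(d)

    def f_i(x, i):                        # per-sample smooth loss 0.5*(a_i^T x - b_i)^2
        r = A[i] @ x - b[i]
        return 0.5 * r * r

    def grad_f_i(x, i):                   # gradient of the per-sample loss
        return (A[i] @ x - b[i]) * A[i]

    for _ in range(n_outer):
        mu = A.T @ (A @ x_tilde - b) / n  # full gradient at the snapshot point
        x = x_tilde.copy()
        x_prev = x.copy()
        for _ in range(n_inner):
            y = x + beta * (x - x_prev)   # acceleration (momentum) step
            i = rng.integers(n)
            gi = grad_f_i(y, i)
            v = gi - grad_f_i(x_tilde, i) + mu   # variance-reduced gradient
            # Stochastic Polyak-type step: (f_i(y) - f_i^*) / ||grad f_i(y)||^2,
            # with f_i^* = 0 for the squared loss; capped by eta_max (assumed safeguard).
            eta = min(eta_max, f_i(y, i) / (np.dot(gi, gi) + 1e-12))
            x_prev = x
            x = soft_threshold(y - eta * v, eta * lam)  # proximal (l1) step
        x_tilde = x                       # update the snapshot
    return x_tilde

if __name__ == "__main__":
    # Tiny usage example on synthetic data.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = acc_prox_svrg_polyak_sketch(A, b, lam=1e-3)
    obj = 0.5 * np.mean((A @ x_hat - b) ** 2) + 1e-3 * np.sum(np.abs(x_hat))
    print("final composite objective:", obj)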

Keywords: Polyak step size; variance reduction; machine learning; stochastic gradient

王福胜、史鲁玉


School of Mathematics and Statistics, Taiyuan Normal University, Jinzhong 030619, Shanxi, China


Shanxi Provincial Basic Research Program (Free Exploration) Fund (Grant No. 202103021224303)

2024

Operations Research Transactions (运筹学学报)
Operations Research Society of China

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.25
ISSN: 1007-6093
Year, Volume (Issue): 2024, 28(2)