Operations Research Transactions (运筹学学报), 2024, Vol. 28, Issue 2: 131–142. DOI: 10.15960/j.cnki.issn.1007-6093.2024.02.010

An accelerated proximal stochastic gradient method with variance reduction based on Polyak step size

WANG Fusheng, SHI Luyu

Author information

  • 1. School of Mathematics and Statistics, Taiyuan Normal University, Jinzhong, Shanxi 030619, China


Abstract

To solve stochastic composite optimization problems in large-scale machine learning, we propose a new accelerated proximal stochastic variance-reduced gradient algorithm, Acc-Prox-SVRG-Polyak, which combines the Acc-Prox-SVRG algorithm with the Polyak step-size method. Compared with existing algorithms, the new algorithm fully exploits the advantages of acceleration techniques and the Polyak step size to improve accuracy. Convergence of the algorithm is established under standard assumptions, and its complexity is analyzed. Finally, numerical experiments on standard data sets verify the effectiveness of the new algorithm.


Key words

Polyak step size / variance reduction / machine learning / stochastic gradient


Funding

Shanxi Basic Research Program (Free Exploration) Fund (202103021224303)

Year of publication: 2024
Journal: Operations Research Transactions (运筹学学报), Operations Research Society of China
ISSN: 1007-6093