Neural Networks, 2022, Vol. 149. DOI: 10.1016/j.neunet.2022.02.012

Sign Stochastic Gradient Descents without bounded gradient assumption for the finite sum minimization

Sun, Tao; Li, Dongsheng

Author Information

  • 1. College of Computer, National University of Defense Technology

Abstract

Sign-based Stochastic Gradient Descents (sign-based SGDs) use only the signs of the stochastic gradients in order to reduce communication costs. Nevertheless, existing convergence results for sign-based SGDs applied to finite sum optimization rely on the bounded gradient assumption, which fails to hold in many cases. This paper presents a convergence framework for sign-based SGDs that eliminates the bounded gradient assumption. Ergodic convergence rates are established under only the smoothness assumption on the objective functions. The Sign Stochastic Gradient Descent (signSGD) and two of its variants, the majority vote and zeroth-order versions, are developed for different application settings. Our framework also removes the bounded gradient assumption used in previous analyses of these three algorithms. (C) 2022 Elsevier Ltd. All rights reserved.
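
The update the abstract describes keeps only the sign of each stochastic gradient coordinate, so a worker in a distributed setting needs to transmit just one bit per coordinate. Below is a minimal sketch of one such step in Python; the least-squares objective, batch size, and step size are illustrative assumptions, not taken from the paper.

    import numpy as np

    def signsgd_step(params, stochastic_grad, lr=0.01):
        # One sign-based SGD update: each coordinate moves by -lr * sign(gradient),
        # so only the one-bit signs need to be communicated.
        return params - lr * np.sign(stochastic_grad)

    # Hypothetical usage on a least-squares objective f(w) = 0.5 * ||A w - b||^2.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(100, 5))
    b = rng.normal(size=100)
    w = np.zeros(5)
    for _ in range(200):
        idx = rng.integers(0, 100, size=10)        # mini-batch of sample indices
        grad = A[idx].T @ (A[idx] @ w - b[idx])    # stochastic gradient on the batch
        w = signsgd_step(w, grad, lr=0.01)

In the majority vote variant mentioned in the abstract, each worker typically sends the sign of its local stochastic gradient and the server applies the sign of the coordinate-wise sum of those signs; the zeroth-order variant typically replaces the gradient with a finite-difference estimate built from function values only. The exact schemes analyzed in the paper may differ in their details.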

Keywords

Sign Stochastic Gradient Descent; Convergence; Unbounded gradient; Zeroth-order; Majority vote

Publication year: 2022
Journal: Neural Networks
Indexed in: EI, SCI
ISSN: 0893-6080
Cited by: 6
References: 45