
A Class of Differentially Private Stochastic Gradient Descent Algorithms with Adaptive Gradient Clipping

Gradient clipping is an effective method for preventing gradient explosion, but the choice of the clipping parameter usually has a large impact on the performance of the trained model. To address this issue, this paper improves the standard differentially private stochastic gradient descent (DP-SGD) algorithm. First, an adaptive gradient clipping method is proposed: building on traditional clipping, it uses a quantile strategy combined with exponential averaging to adjust the clipping parameter dynamically and adaptively, yielding a class of DP-SGD algorithms with adaptive gradient clipping. Second, convergence and privacy analyses of the proposed algorithm are given for the case of non-convex objective functions. Finally, numerical simulations are performed on the MNIST, Fashion-MNIST, and IMDB datasets. The results show that, compared with traditional gradient clipping, the proposed adaptive gradient clipping algorithm significantly improves model accuracy.
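The adaptive rule summarized in the abstract (moving the clipping threshold toward a quantile of the observed gradient norms via exponential averaging, then clipping and adding Gaussian noise as in DP-SGD) can be sketched roughly as follows. The function name, default parameters, and noise calibration are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def adaptive_clip_dpsgd_step(per_sample_grads, C, noise_multiplier,
                             quantile=0.5, beta=0.9, rng=None):
    """One DP-SGD step with an adaptively updated clipping threshold.

    Hypothetical sketch: the threshold C is pulled by exponential
    averaging toward the given quantile of the batch's gradient norms.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    norms = np.linalg.norm(per_sample_grads, axis=1)
    # Exponential-average update of C toward the batch quantile of norms
    C_new = beta * C + (1.0 - beta) * np.quantile(norms, quantile)
    # Clip each per-sample gradient to norm at most C_new
    scale = np.minimum(1.0, C_new / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale[:, None]
    # Sum, add Gaussian noise calibrated to the clipping bound, average
    noisy = clipped.sum(axis=0) + rng.normal(
        0.0, noise_multiplier * C_new, size=per_sample_grads.shape[1])
    return noisy / per_sample_grads.shape[0], C_new
```

With `beta` close to 1 the threshold changes slowly, which keeps the per-step sensitivity (and hence the noise scale) stable across iterations.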

Keywords: stochastic gradient descent; differential privacy; gradient clipping; adaptivity

Zhang Jiaqi (张家棋), Li Jueyou (李觉友)


College of Mathematics and Statistics, Chongqing University, Chongqing 400044, China

School of Mathematical Sciences, Chongqing Normal University, Chongqing 401331, China


Funding: National Key R&D Program of China (2023YFA1011303); National Natural Science Foundation of China (11971083, 11991024); Natural Science Foundation of Chongqing (cstc2020jcyjmsxmX0287)

运筹学学报 (Operations Research Transactions)
Operations Research Society of China

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.25
ISSN: 1007-6093
Year, Volume (Issue): 2024, 28(2)