Operations Research Transactions, 2024, Vol. 28, Issue (2): 47-57. DOI: 10.15960/j.cnki.issn.1007-6093.2024.02.003

A class of differentially private stochastic gradient descent algorithms with adaptive gradient clipping

ZHANG Jiaqi 1, LI Jueyou 2

Author information

  • 1. College of Mathematics and Statistics, Chongqing University, Chongqing 400044, China
  • 2. School of Mathematical Sciences, Chongqing Normal University, Chongqing 401331, China

Abstract

Gradient clipping is an effective method to prevent gradient explosion, but the choice of the gradient clipping parameter usually has a large influence on the performance of the trained model. To address this issue, this paper improves the standard differentially private stochastic gradient descent (DP-SGD) algorithm. First, an adaptive gradient clipping method is proposed: building on the traditional clipping method, the clipping parameter is dynamically and adaptively adjusted using a quantile and exponential-averaging strategy, which yields a class of DP-SGD algorithms with adaptive gradient clipping. Second, convergence and privacy analyses of the proposed adaptive algorithm are given for the case of non-convex objective functions. Finally, numerical experiments are performed on the MNIST, Fashion-MNIST and IMDB datasets. The results show that, compared with the traditional gradient clipping algorithm, the proposed adaptive gradient clipping algorithm significantly improves model accuracy.
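The abstract describes per-sample gradient clipping in DP-SGD with a clipping threshold that is adaptively nudged toward a quantile of the per-sample gradient norms via exponential averaging. The following is a minimal NumPy sketch of that idea, not the authors' exact algorithm: the function name, the specific update rule `C ← βC + (1−β)·quantile(norms)`, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def adaptive_clip_dpsgd_step(params, per_sample_grads, C, lr=0.1,
                             noise_multiplier=1.0, quantile=0.5, beta=0.9,
                             rng=np.random.default_rng(0)):
    """One DP-SGD step with an adaptively updated clipping threshold C.

    Illustrative sketch: C is moved toward the `quantile`-th quantile of
    the per-sample gradient norms by an exponential moving average.
    """
    B = len(per_sample_grads)
    norms = np.array([np.linalg.norm(g) for g in per_sample_grads])
    # Clip each per-sample gradient so its norm is at most C.
    clipped = [g * min(1.0, C / (n + 1e-12))
               for g, n in zip(per_sample_grads, norms)]
    # Add Gaussian noise calibrated to the clipping bound, then average.
    noisy_sum = (np.sum(clipped, axis=0)
                 + rng.normal(0.0, noise_multiplier * C, size=params.shape))
    new_params = params - lr * noisy_sum / B
    # Adaptive update: exponential average toward the batch quantile of norms.
    C_new = beta * C + (1.0 - beta) * np.quantile(norms, quantile)
    return new_params, C_new
```

With `noise_multiplier = 0` the step reduces to ordinary clipped SGD, which makes the clipping and the threshold update easy to check in isolation; in a private run the noise scale must track `C`, which is why the threshold itself also has to be updated in a privacy-respecting way in the paper's setting.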

Key words

stochastic gradient descent algorithm / differential privacy / gradient clipping / adaptivity


Funding

National Key R&D Program of China (2023YFA1011303)

National Natural Science Foundation of China (11971083)

National Natural Science Foundation of China (11991024)

Natural Science Foundation of Chongqing (cstc2020jcyjmsxmX0287)

Year of publication

2024
Operations Research Transactions
Operations Research Society of China

ISSN: 1007-6093