A class of differentially private stochastic gradient descent algorithms with adaptive gradient clipping
Gradient clipping is an effective method for preventing gradient explosion, but the choice of the gradient clipping parameter usually has a strong influence on the performance of the trained model. To address this issue, this paper proposes an improved differentially private stochastic gradient descent algorithm that adaptively adjusts the gradient clipping parameter. First, an adaptive gradient clipping method is proposed that uses quantiles together with an exponential averaging strategy to dynamically adjust the clipping parameter. Second, the convergence and privacy of the proposed algorithm are analyzed for the case of a non-convex objective function. Finally, numerical simulations are performed on the MNIST, Fashion-MNIST, and IMDB datasets. The results show that the proposed algorithm significantly improves model accuracy compared with traditional stochastic gradient descent methods.
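As a rough illustration of the idea summarized above, the following Python sketch shows one DP-SGD step in which the clipping parameter is moved toward a batch quantile of the per-example gradient norms via an exponential moving average, and Gaussian noise is calibrated to the current clipping norm. All names and parameters (`dp_sgd_step`, `target_quantile`, `ema_beta`, `noise_multiplier`) are illustrative assumptions, not the paper's notation; in particular, the quantile here is computed non-privately for simplicity, whereas the paper's actual mechanism and privacy accounting may differ.

```python
# Minimal sketch of quantile/EMA-adaptive clipping inside a DP-SGD step.
# This is an assumed illustration, not the paper's exact algorithm:
# the quantile of gradient norms is computed without privacy protection.
import numpy as np

rng = np.random.default_rng(0)

def dp_sgd_step(per_sample_grads, C, target_quantile=0.5,
                ema_beta=0.9, noise_multiplier=1.0):
    """One DP-SGD step with an adaptively updated clipping parameter C.

    per_sample_grads: (batch, dim) array of per-example gradients.
    Returns the noisy averaged gradient and the updated C.
    """
    norms = np.linalg.norm(per_sample_grads, axis=1)

    # Adapt C: exponential moving average toward the chosen batch
    # quantile of the per-example gradient norms.
    C = ema_beta * C + (1.0 - ema_beta) * np.quantile(norms, target_quantile)

    # Clip each per-example gradient to norm at most C.
    scale = np.minimum(1.0, C / (norms + 1e-12))
    clipped = per_sample_grads * scale[:, None]

    # Average and add Gaussian noise calibrated to sensitivity C
    # (standard Gaussian mechanism for DP-SGD).
    batch, dim = per_sample_grads.shape
    noise = rng.normal(0.0, noise_multiplier * C, size=dim)
    noisy_grad = (clipped.sum(axis=0) + noise) / batch
    return noisy_grad, C

# Usage on synthetic per-example gradients.
grads = rng.normal(size=(64, 10))
C = 1.0
g, C = dp_sgd_step(grads, C)
print("updated clipping parameter:", round(C, 3))
```

The exponential averaging smooths the clipping parameter across iterations, so a single batch with unusually large or small gradients does not cause the threshold (and hence the injected noise scale) to jump abruptly.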