Differentially private SGD with random features

In the realm of large-scale machine learning, it is crucial to explore methods that reduce computational complexity and memory demands while maintaining generalization performance. Moreover, since the collected data may contain sensitive information, studying privacy-preserving machine learning algorithms is also of great significance. This paper focuses on the performance of the differentially private stochastic gradient descent (SGD) algorithm based on random features. First, the algorithm maps the original data into a low-dimensional space, thereby avoiding the large-scale data storage requirement of traditional kernel methods. Next, it iteratively optimizes the parameters by stochastic gradient descent. Finally, an output perturbation mechanism injects random noise to guarantee the privacy of the algorithm. We prove that the proposed algorithm satisfies differential privacy while achieving fast convergence rates under some mild conditions.
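A minimal sketch of this three-step pipeline (random Fourier features, single-pass SGD on the squared loss, output perturbation) is given below. The function names, the step-size schedule, and the `noise_scale` parameter are illustrative assumptions; they do not reproduce the exact algorithm or the privacy-calibrated noise analyzed in the paper.

```python
import numpy as np

def random_fourier_features(X, W, b, D):
    """Random Fourier feature map approximating a Gaussian (RBF) kernel."""
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def dp_sgd_random_features(X, y, D=200, gamma=1.0, step=0.1,
                           noise_scale=0.1, seed=0):
    """Single-pass SGD on the squared loss in the random-feature space,
    followed by output perturbation (Gaussian noise added to the weights).
    noise_scale is a placeholder; in the paper it would be calibrated to the
    sensitivity of the output and the target privacy budget."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))  # random frequencies
    b = rng.uniform(0.0, 2 * np.pi, size=D)                # random phases
    Phi = random_fourier_features(X, W, b, D)

    w = np.zeros(D)
    for t, (phi_t, y_t) in enumerate(zip(Phi, y), start=1):
        eta = step / np.sqrt(t)                  # decaying step size
        grad = (phi_t @ w - y_t) * phi_t         # gradient of the squared loss
        w -= eta * grad

    # Output perturbation: release the noisy weight vector only.
    w_private = w + rng.normal(scale=noise_scale, size=D)
    return w_private, W, b

def predict(X_new, w, W, b, D=200):
    return random_fourier_features(X_new, W, b, D) @ w

# Usage on synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
w, W, b = dp_sgd_random_features(X, y)
y_hat = predict(X, w, W, b)
```

Because only the perturbed weights are released, the privacy guarantee follows from the output perturbation step, while the random-feature map keeps the parameter dimension D independent of the sample size.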

learning theory, differential privacy, stochastic gradient descent, random features, reproducing kernel Hilbert spaces

WANG Yi-guang, GUO Zheng-chu

Polytechnic Institute of Zhejiang University, Zhejiang University, Hangzhou 310015, China

School of Mathematical Sciences, Zhejiang University, Hangzhou 310058, China

Zhejiang Provincial Natural Science Foundation of China (LR20A010001); National Natural Science Foundation of China (12271473); National Natural Science Foundation of China (U21A20426)

2024

Applied Mathematics - A Journal of Chinese Universities (Series B)
Zhejiang University; China Society for Industrial and Applied Mathematics

Impact factor: 0.146
ISSN:1005-1031
Year, Volume (Issue): 2024, 39(1)