Lightweight Differential Privacy Federated Learning Based on Gradient Dropout
To address privacy issues in traditional machine learning, federated learning has received widespread attention and research as the first collaborative online learning solution that does not require users to upload real data, only model updates. However, it requires users to train locally and upload model updates that may still contain sensitive information, which raises new privacy concerns. At the same time, because the complete training must be performed locally by each user, computational and communication overheads become particularly critical, so there is also an urgent need for a lightweight federated learning architecture. In this paper, a federated learning framework with a differential privacy mechanism is used to meet these further privacy requirements. In addition, a Fisher information matrix-based dropout mechanism, FisherDropout, is proposed for the first time for the optimal selection of each dimension of the gradients updated on the client side. This mechanism greatly reduces computing cost, communication cost, and privacy budget, and establishes a federated learning framework that is both private and lightweight. Extensive experiments on real-world datasets demonstrate the effectiveness of the scheme. Experimental results show that, in the best case, the FisherDropout mechanism can save 76.8%~83.6% of communication overhead and 23.0%~26.2% of computational overhead compared with other federated learning frameworks, and it also has outstanding advantages in balancing privacy and utility under differential privacy.
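The core idea described above can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's actual algorithm: it uses the common diagonal approximation of the Fisher information (the element-wise squared gradient) to score each gradient dimension, keeps only the top fraction of dimensions, and zeroes the rest, so the client uploads a sparse update. The function name `fisher_dropout` and the `keep_ratio` parameter are assumptions for illustration.

```python
import numpy as np

def fisher_dropout(grad, keep_ratio=0.2):
    """Hypothetical sketch of Fisher-information-based gradient dropout.

    Scores each dimension by its squared gradient (a diagonal
    approximation of the Fisher information), keeps the top
    `keep_ratio` fraction of dimensions, and zeroes the rest.
    Returns the sparsified gradient and the kept indices.
    """
    fisher_diag = grad ** 2                        # per-dimension Fisher score
    k = max(1, int(keep_ratio * grad.size))        # number of dimensions kept
    keep_idx = np.argpartition(fisher_diag, -k)[-k:]  # indices of top-k scores
    mask = np.zeros_like(grad)
    mask[keep_idx] = 1.0
    return grad * mask, keep_idx

rng = np.random.default_rng(0)
g = rng.normal(size=100)
sparse_g, idx = fisher_dropout(g, keep_ratio=0.2)
# only 20 of 100 dimensions survive; the rest are zeroed before upload
```

Because only the surviving dimensions need to be transmitted (and only they consume privacy budget when noised), such a selection step is where the communication, computation, and budget savings claimed in the abstract would come from.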
Federated learning; Differential privacy; Fisher information matrix; Dropout; Lightweight