Application of Parameter Decoupling in Differentially Private Federated Learning
Federated learning (FL) is an advanced privacy-preserving machine learning technique in which multiple parties collaboratively train a shared model by exchanging model parameters, without centrally aggregating raw data. Although participants in FL never explicitly share their data, many studies show that they still face various privacy inference attacks that can leak private information. To address this issue, the research community has proposed a range of solutions. One strict privacy protection method applies local differential privacy (LDP) to federated learning: participants add random noise to their model parameters before uploading them, effectively resisting inference attacks by malicious adversaries. However, the noise introduced by LDP degrades model performance, and recent research suggests that this degradation is related to the additional inter-client heterogeneity that LDP introduces. To address the FL performance degradation caused by LDP, a parameter-decoupling-based federated learning scheme with differential privacy protection (PD-LDPFL) is proposed. In addition to the basic model issued by the server, each client also learns personalized input and output models locally. During transmission, a client uploads only the noised parameters of the basic model, while the personalized models are retained locally; they adaptively transform the input and output distributions of the client's local data, mitigating the additional heterogeneity introduced by LDP and reducing the accuracy loss. Moreover, the study finds that even under a higher privacy budget, the scheme naturally resists certain gradient-based privacy inference attacks, such as deep leakage from gradients. Experiments on three commonly used datasets, MNIST, FMNIST, and CIFAR-10, show that the scheme not only achieves better performance than traditional differentially private federated learning but also provides additional security.
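The client-side step described above (upload only the noised basic-model parameters, keep the personalized input/output models local) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the parameter naming convention (`input.`/`output.` prefixes), the L2 clipping, and the Gaussian-mechanism noise calibration are all assumptions made for the example.

```python
import numpy as np

def gaussian_noise_scale(clip_norm, epsilon, delta):
    # Standard Gaussian-mechanism sigma for an L2-clipped update
    # (illustrative calibration; the paper's mechanism may differ).
    return clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

def client_update_for_upload(params, clip_norm=1.0, epsilon=2.0, delta=1e-5,
                             rng=None):
    """Clip and noise only the shared 'basic' model parameters.

    `params` maps parameter names to numpy arrays; names starting with
    'input.' or 'output.' denote the personalized models, which are
    retained locally and never transmitted.
    """
    rng = np.random.default_rng() if rng is None else rng
    shared = {k: v for k, v in params.items()
              if not (k.startswith("input.") or k.startswith("output."))}
    # L2-clip the concatenated shared update so its sensitivity <= clip_norm.
    flat = np.concatenate([v.ravel() for v in shared.values()])
    scale = min(1.0, clip_norm / (np.linalg.norm(flat) + 1e-12))
    sigma = gaussian_noise_scale(clip_norm, epsilon, delta)
    # Add independent Gaussian noise to each shared parameter tensor.
    return {k: v * scale + rng.normal(0.0, sigma, size=v.shape)
            for k, v in shared.items()}
```

Because the personalized parameters never leave the client, they incur no privacy cost and are free to adapt to the local data distribution, which is the mechanism the abstract credits for offsetting the heterogeneity LDP adds between clients.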