Secure and Efficient Federated Learning for Multi-domain Data Scenarios
To tackle the challenges of poor generalization, catastrophic forgetting, and privacy attacks that federated learning faces when training on multi-domain data, a secure and efficient federated learning scheme for multi-domain data scenarios (SEFL-MDS) is proposed. In the local training phase, knowledge distillation is employed to prevent catastrophic forgetting during multi-domain training while accelerating knowledge transfer across domains, thereby improving training efficiency. In the uploading phase, Gaussian noise is added to the locally updated gradients and the cross-domain generalization differences via the Gaussian differential privacy mechanism, ensuring secure data uploads and strengthening the confidentiality of the training process. In the aggregation phase, a dynamic generalization-weighted algorithm is used to reduce generalization differences across domains, thereby enhancing the model's generalization capability. Theoretical analysis demonstrates the high robustness of the proposed scheme. Experiments on the PACS and Office-Home datasets show that the proposed scheme achieves higher accuracy with reduced training time.
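The three phases outlined above can be illustrated with a minimal NumPy sketch. All function names, the temperature-scaled KL distillation loss, the softmax weighting over generalization gaps, and every parameter value below are illustrative assumptions for exposition; they are not the paper's actual SEFL-MDS algorithm.

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = z / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """Local training phase (sketch): KL divergence between softened
    teacher and student distributions, scaled by T^2 as is conventional
    in knowledge distillation."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)))) * T * T

def gaussian_dp_noise(grad, clip_norm, sigma, rng):
    """Uploading phase (sketch): clip the update to bound L2 sensitivity,
    then add Gaussian noise (the Gaussian mechanism); sigma is the
    noise multiplier."""
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma * clip_norm, size=grad.shape)

def generalization_weighted_aggregate(updates, gen_gaps):
    """Aggregation phase (sketch): weight client updates by a softmax
    over their generalization gaps -- a hypothetical dynamic weighting
    rule standing in for the paper's algorithm."""
    gaps = np.asarray(gen_gaps, dtype=float)
    weights = np.exp(gaps - gaps.max())
    weights = weights / weights.sum()
    return sum(w * u for w, u in zip(weights, updates))

# Toy round with three simulated clients holding 8-dimensional updates.
rng = np.random.default_rng(0)
updates = [rng.normal(size=8) for _ in range(3)]
noised = [gaussian_dp_noise(u, clip_norm=1.0, sigma=0.5, rng=rng) for u in updates]
global_update = generalization_weighted_aggregate(noised, gen_gaps=[0.2, 0.5, 0.1])
```

In this sketch, privacy comes only from clipping plus noise before upload, and aggregation deliberately up-weights clients whose domains generalize worse, mirroring the abstract's goal of shrinking cross-domain generalization differences.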