A Data-Free Personalized Federated Learning Algorithm Based on Knowledge Distillation
Federated learning algorithms usually face large heterogeneity among clients, and this heterogeneity degrades the performance of the global model; knowledge distillation approaches can mitigate the degradation. To further remove the reliance on public data and improve model performance, DFP-KD trains a robust federated learning global model using data-free knowledge distillation: it employs ReACGAN as the generator component and adopts a step-by-step EMA fast-update strategy, which accelerates the update of the global model while avoiding catastrophic forgetting. Comparison experiments, ablation experiments, and parameter-sensitivity experiments show that DFP-KD outperforms classical data-free knowledge distillation algorithms in terms of accuracy, stability, and update rate.
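To make the EMA update concrete, the following is a minimal sketch, assuming PyTorch models; the decay schedule, its parameter values, and all function names here are illustrative assumptions, not the paper's actual implementation.

```python
import torch

@torch.no_grad()
def ema_update(global_model: torch.nn.Module,
               student_model: torch.nn.Module,
               decay: float) -> None:
    """Blend the freshly distilled student into the global model.

    A larger decay keeps more of the old global weights (guarding
    against catastrophic forgetting); a smaller decay speeds up the
    update of the global model.
    """
    for g_param, s_param in zip(global_model.parameters(),
                                student_model.parameters()):
        g_param.mul_(decay).add_(s_param, alpha=1.0 - decay)

def stepwise_decay(round_idx: int, warmup_rounds: int = 10,
                   early: float = 0.9, late: float = 0.99) -> float:
    # Hypothetical "step-by-step" schedule: update fast in early
    # rounds, then become conservative to preserve learned knowledge.
    return early if round_idx < warmup_rounds else late

# Usage in a distillation round (student assumed freshly distilled):
# ema_update(global_model, student_model, stepwise_decay(round_idx))
```

The interpolation itself is the standard exponential-moving-average update; the step-wise choice of the decay value per round is what trades off update speed against forgetting.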