An Edge Federated Learning Algorithm Based on Knowledge Distillation
Clients participating in federated learning in an edge computing environment typically hold limited data, and federated learning algorithms that train models only on hard-label knowledge are difficult to improve further in accuracy. To address this problem, an edge federated learning algorithm based on knowledge distillation was proposed. Since the soft-label information extracted by knowledge distillation can effectively improve model performance, knowledge distillation technology was introduced into the model training of federated learning. In each round of federated training, every client uploads its model parameters and sample logit values to the edge server; the server aggregates them into a global model and a global soft label and sends both back to the clients for the next round of learning, so that each client is also guided by global soft-label knowledge during local training. At the same time, a dynamic adjustment mechanism was designed for the proportions of soft-label knowledge and hard-label knowledge in model training, so that both kinds of knowledge can reasonably guide model training in federated learning. The experimental results verified that the proposed edge federated learning algorithm based on knowledge distillation can effectively improve the accuracy of the model.
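The client-side training objective described above can be sketched as a weighted combination of a soft-label distillation term and a hard-label cross-entropy term. The following is a minimal NumPy illustration, not the paper's implementation: the function names, the temperature value, and the `alpha_schedule` rule for dynamically adjusting the soft/hard proportion are all assumptions made for the sketch.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis (numerically stable)."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(logits, hard_label, global_soft, alpha, T=2.0):
    """Combined loss for one sample:
        alpha * KL(global_soft || student) * T^2  +  (1 - alpha) * CE(hard_label).

    `global_soft` stands for the server-aggregated global soft label for this
    sample; `alpha` is the dynamically adjusted weight of soft-label knowledge.
    Both names are illustrative, not taken from the paper.
    """
    global_soft = np.asarray(global_soft, dtype=float)
    # Soft-label term: KL divergence between the global soft label and the
    # student's temperature-softened prediction, scaled by T^2 as is customary
    # in knowledge distillation.
    p = softmax(logits, T)
    kl = float(np.sum(global_soft * (np.log(global_soft + 1e-12)
                                     - np.log(p + 1e-12)))) * T * T
    # Hard-label term: standard cross-entropy against the one-hot ground truth.
    q = softmax(logits, 1.0)
    ce = -float(np.log(q[hard_label] + 1e-12))
    return alpha * kl + (1.0 - alpha) * ce

def alpha_schedule(round_idx, total_rounds):
    """One plausible dynamic adjustment: rely more on global soft-label
    knowledge as federated training progresses (an illustrative assumption;
    the paper's actual adjustment rule is not specified in the abstract)."""
    return min(0.9, round_idx / total_rounds)
```

With `alpha = 0` the loss reduces to ordinary hard-label training, and with `alpha` near 1 the global soft labels dominate; the schedule lets early rounds lean on hard labels while later rounds exploit the distilled global knowledge.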