Analysis of Dynamic Federated Learning via Mutual Distillation Technique in Edge-Cloud Collaboration
This paper describes FedAKD, an edge-cloud collaborative dynamic federated learning algorithm based on mutual distillation. FedAKD reduces communication resource consumption by uploading distillation models, and it adaptively adjusts the mutual distillation in each training round via the SW-UCB algorithm, thereby mitigating data heterogeneity and accelerating model training. FedAKD is validated on the CIFAR-10 and CIFAR-100 datasets under three different data-heterogeneity settings, demonstrating its effectiveness and efficiency.
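The adaptive adjustment mentioned above relies on SW-UCB, commonly read as sliding-window upper confidence bound, a bandit algorithm suited to non-stationary settings such as federated training rounds. The sketch below is a minimal, generic SW-UCB implementation for illustration only; the class name, window size, and exploration constant are assumptions, and the paper's actual arm definition (e.g. which distillation configuration each arm represents) and reward signal are not specified here.

```python
import math
from collections import deque

class SlidingWindowUCB:
    """Generic SW-UCB sketch: arm statistics are computed over only the
    last `window` observations, so estimates can track a drifting reward
    (e.g. the changing benefit of mutual distillation across rounds)."""

    def __init__(self, n_arms, window=100, c=0.5):
        self.n_arms = n_arms
        self.c = c                       # exploration strength (assumed value)
        self.history = deque(maxlen=window)  # recent (arm, reward) pairs

    def select(self):
        # Recompute counts and reward sums inside the current window.
        counts = [0] * self.n_arms
        sums = [0.0] * self.n_arms
        for arm, reward in self.history:
            counts[arm] += 1
            sums[arm] += reward
        # Play any arm with no observations in the window first.
        for arm in range(self.n_arms):
            if counts[arm] == 0:
                return arm
        t = len(self.history)
        # UCB score = windowed mean + exploration bonus.
        scores = [
            sums[a] / counts[a] + self.c * math.sqrt(math.log(t) / counts[a])
            for a in range(self.n_arms)
        ]
        return max(range(self.n_arms), key=lambda a: scores[a])

    def update(self, arm, reward):
        self.history.append((arm, reward))
```

In a federated-learning round, the server would call `select()` to pick a distillation setting, run the round, and feed an observed quality signal (for instance, validation accuracy improvement) back through `update()`. The sliding window lets the policy forget stale rounds, which matters because the usefulness of each setting changes as the global model converges.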