
Analysis of Dynamic Federated Learning via Mutual Distillation Technique in Edge-Cloud Collaboration

This paper presents FedAKD, an edge-cloud collaborative dynamic federated learning algorithm based on mutual distillation. FedAKD reduces communication resource consumption by uploading distilled models, and adaptively adjusts the mutual distillation in each training round via the SW-UCB (sliding-window upper confidence bound) algorithm, thereby mitigating data heterogeneity and accelerating model training. FedAKD is validated on the CIFAR10 and CIFAR100 datasets under three different data heterogeneity settings, demonstrating its effectiveness and efficiency.
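The record does not include the authors' code. As a rough, hypothetical sketch of the mutual-distillation idea the abstract names, the snippet below shows a typical mutual knowledge distillation loss in the style of deep mutual learning: each model minimizes cross-entropy on its local labels plus a KL term toward the other model's temperature-softened predictions. The function name, temperature, and weighting are assumptions, not FedAKD's actual objective.

```python
import torch
import torch.nn.functional as F

def mutual_distillation_loss(student_logits: torch.Tensor,
                             peer_logits: torch.Tensor,
                             labels: torch.Tensor,
                             temperature: float = 2.0,
                             alpha: float = 0.5) -> torch.Tensor:
    """Hypothetical mutual-distillation objective (not FedAKD's exact
    loss): cross-entropy on the local labels plus a KL term that pulls
    this model's predictions toward the peer model's softened outputs."""
    ce = F.cross_entropy(student_logits, labels)
    # Soften both distributions with a temperature before comparing;
    # the peer is detached so gradients flow only into this model.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_peer = F.softmax(peer_logits.detach() / temperature, dim=1)
    kd = F.kl_div(log_p_student, p_peer, reduction="batchmean") * temperature ** 2
    return (1 - alpha) * ce + alpha * kd
```

In mutual (two-way) distillation each side computes this loss with the roles of student and peer swapped, so both models learn from each other rather than from a fixed teacher.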
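Similarly, SW-UCB commonly refers to the sliding-window upper confidence bound bandit policy (Garivier and Moulines), which discards observations older than a fixed window so the policy can track non-stationary rewards, here plausibly the changing benefit of distillation across training rounds. The following is a minimal generic sketch, not the paper's exact adaptation; the window size, exploration constant, and reward definition are assumptions.

```python
import math
from collections import deque

class SlidingWindowUCB:
    """Minimal sliding-window UCB bandit: only the most recent `window`
    plays contribute to each arm's mean and confidence bound."""
    def __init__(self, n_arms: int, window: int = 50, xi: float = 0.6):
        self.n_arms = n_arms
        self.window = window      # plays older than this are forgotten
        self.xi = xi              # exploration strength
        self.history = deque()    # (arm, reward) pairs, newest last

    def select(self, t: int) -> int:
        """Pick an arm at round t (1-indexed) by mean + confidence bonus."""
        counts = [0] * self.n_arms
        sums = [0.0] * self.n_arms
        for arm, reward in self.history:
            counts[arm] += 1
            sums[arm] += reward
        best_arm, best_score = 0, float("-inf")
        for i in range(self.n_arms):
            if counts[i] == 0:
                return i  # play each arm once before using the bound
            mean = sums[i] / counts[i]
            bonus = math.sqrt(self.xi * math.log(min(t, self.window)) / counts[i])
            if mean + bonus > best_score:
                best_arm, best_score = i, mean + bonus
        return best_arm

    def update(self, arm: int, reward: float) -> None:
        self.history.append((arm, reward))
        if len(self.history) > self.window:
            self.history.popleft()
```

In a federated setting the "arms" could be candidate distillation configurations for a round and the reward a measure of training progress, but that mapping is an illustration, not the paper's definition.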

Keywords: edge computing; federated learning; mutual knowledge distillation; heterogeneous data; constrained resources

Chen Qian, Xu Hongli


School of Computer Science and Technology, University of Science and Technology of China, Anhui 230026

Suzhou Institute for Advanced Research, University of Science and Technology of China, Jiangsu 235123


National Key Research and Development Program of China

2021YFB3301500

2024

Electronic Technology
Shanghai Electronics Society, Shanghai Communications Society


Impact factor: 0.296
ISSN: 1000-0755
Year, Volume (Issue): 2024, 53(3)