Efficient soft clustering federated learning based on locality principle

Soft clustering is often used to mitigate the model-accuracy degradation caused by non-independent and identically distributed (non-IID) data in multi-task federated learning (FL) scenarios. However, soft clustering requires each client to upload and download more model parameters. To address this challenge, a federated learning algorithm based on the principle of locality (FedPol) was proposed. A proximal local update mechanism was adopted to ensure that each client's local update fluctuates within a bounded range. By exploiting the locality of each client's data distribution, historical distribution information was integrated into the model training process, which accelerated model convergence and reduced the number of parameters to be transmitted. Simulation experiments show that, on non-IID data, FedPol reduces the number of iteration rounds by about 10% compared with other algorithms while maintaining model accuracy, effectively lowering communication cost.
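The proximal local update described in the abstract can be illustrated with a FedProx-style penalty that pulls each client's parameters back toward the global model. The following is a minimal sketch, not the paper's actual implementation: it assumes a scalar least-squares model, and the function name, hyperparameters, and toy data are all illustrative.

```python
# Hypothetical sketch of a proximal local update (FedProx-style penalty).
# Local objective: F(w) = mean((w*x - y)^2) + (mu/2) * (w - w_global)^2,
# so the proximal term keeps the client's update within a bounded
# neighbourhood of the global model, as the abstract describes.

def proximal_local_update(w_global, data, mu=0.1, lr=0.01, epochs=50):
    """Run local gradient descent while penalizing drift from w_global."""
    w = w_global
    for _ in range(epochs):
        # Gradient of the local squared-error loss ...
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        # ... plus the gradient of the proximal penalty term.
        grad += mu * (w - w_global)
        w -= lr * grad
    return w

# Toy client data roughly following y = 2x; across clients such local
# distributions would differ (the non-IID setting of the abstract).
client_data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]
w_new = proximal_local_update(w_global=0.0, data=client_data, mu=0.5)
```

With a larger `mu`, the returned `w_new` stays closer to `w_global`; with `mu=0`, it reduces to plain local gradient descent on the client's own data.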
federated learning; clustering algorithm; locality principle; proximal local update