Federated Learning Optimization Method in Non-IID Scenarios
Federated Learning (FL) enables multiple clients to collaboratively train a global model without compromising data privacy. Nonetheless, this collaborative training paradigm faces the challenge of non-IID data in real-world settings, which leads to slow model convergence and low accuracy. Most existing FL methods improve only one side, either global model aggregation or local client update, and thus inevitably neglect the impact of the other side, degrading the quality of the global model. In this context, we introduce a hierarchical continual-learning optimization method for FL, denoted FedMas, based on the idea of hierarchical fusion. First, clients with similar data distributions are grouped into layers using the DBSCAN clustering algorithm, and in each round only clients from a single layer are selected for training, avoiding the weight divergence that arises when the server aggregates models trained on different data distributions. Further, because each layer has a different data distribution, the local update incorporates a continual-learning remedy for catastrophic forgetting, effectively integrating the differences between the data of clients in different layers and thus preserving the performance of the global model. Experiments on the MNIST and CIFAR-10 benchmark datasets show that the test accuracy of the global model improves by 0.3-2.2 percentage points on average compared with the FedProx, SCAFFOLD, and FedCurv FL algorithms.
Federated Learning (FL); continual learning; data heterogeneity; clustering; hierarchical optimization; data distribution
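The layering step described in the abstract can be sketched as follows. This is a minimal illustration, not FedMas's actual implementation: each client is summarized by its normalized class-label histogram, and a plain DBSCAN pass (the `eps` and `min_pts` values here are illustrative assumptions) groups clients with similar distributions into layers. The server would then sample training clients from a single layer per round.

```python
import math

def dbscan(points, eps, min_pts):
    """Basic DBSCAN over a list of vectors; returns a cluster id per point (-1 = noise)."""
    n = len(points)
    labels = [None] * n
    # Precompute epsilon-neighborhoods (each point neighbors itself).
    neighbors = [[j for j in range(n) if math.dist(points[i], points[j]) <= eps]
                 for i in range(n)]
    cid = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        if len(neighbors[i]) < min_pts:
            labels[i] = -1  # noise (may later become a border point)
            continue
        labels[i] = cid
        seeds = list(neighbors[i])
        while seeds:  # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid  # reclaim noise as a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            if len(neighbors[j]) >= min_pts:
                seeds.extend(neighbors[j])
        cid += 1
    return labels

# Hypothetical clients over 10 classes, summarized by label histograms.
clients = {
    "c0": [0.50, 0.50] + [0.0] * 8,  # data concentrated on classes 0-1
    "c1": [0.45, 0.55] + [0.0] * 8,
    "c2": [0.0] * 8 + [0.50, 0.50],  # data concentrated on classes 8-9
    "c3": [0.0] * 8 + [0.60, 0.40],
}
names = list(clients)
labels = dbscan([clients[k] for k in names], eps=0.3, min_pts=2)
layers = {}
for name, lab in zip(names, labels):
    layers.setdefault(lab, []).append(name)
print(layers)  # clients with similar label distributions share a layer
```

Clients c0/c1 and c2/c3 end up in two separate layers, so a round that samples only one layer aggregates models trained on near-identical distributions.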
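The continual-learning remedy mentioned in the abstract is commonly realized as an EWC-style quadratic penalty (the approach FedCurv also builds on); the sketch below shows that generic form under stated assumptions. The regularization weight `lam`, the Fisher values, and the anchor parameters `theta_star` are all illustrative, not FedMas's exact recipe: parameters deemed important for previously seen layers (high Fisher value) are pulled back toward their anchor, while unimportant ones move freely.

```python
def ewc_penalty(theta, theta_star, fisher, lam):
    # (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2
    return 0.5 * lam * sum(f * (t - ts) ** 2
                           for f, t, ts in zip(fisher, theta, theta_star))

def local_step(theta, grad_task, theta_star, fisher, lam, lr):
    # One SGD step on task loss + EWC penalty;
    # penalty gradient for parameter i is lam * F_i * (theta_i - theta*_i).
    return [t - lr * (g + lam * f * (t - ts))
            for t, g, f, ts in zip(theta, grad_task, fisher, theta_star)]

# Toy demo: two parameters, zero task gradient, anchor at the origin.
# Only the parameter with nonzero Fisher importance is pulled back.
theta = local_step(theta=[1.0, 2.0], grad_task=[0.0, 0.0],
                   theta_star=[0.0, 0.0], fisher=[1.0, 0.0],
                   lam=1.0, lr=0.1)
print(theta)  # first parameter shrinks toward 0, second is untouched
```

In a FedMas-like setting, `theta_star` would be the model state after training on a previous layer, so switching layers does not erase what was already learned.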