Overhead-free Noise-tolerant Federated Learning: A New Baseline
Federated learning (FL) is a promising decentralized machine learning approach that enables multiple distributed clients to train a model jointly while keeping their data private. However, in real-world scenarios, the supervised training data stored on local clients inevitably suffer from imperfect annotations, resulting in subjective, inconsistent and biased labels. These noisy labels can harm the collaborative aggregation process of FL by inducing inconsistent decision boundaries. Unfortunately, few attempts have been made towards noise-tolerant federated learning, and most of them rely on transmitting overhead messages to assist noisy-label detection and correction, which increases the communication burden as well as privacy risks. In this paper, we propose a simple yet effective method for noise-tolerant FL based on the well-established co-training framework. Our method leverages the inherent discrepancy in the learning ability of the local and global models in FL, which can be regarded as two complementary views. By iteratively exchanging samples together with their highly confident predictions, the two models "teach each other" to suppress the influence of noisy labels. The proposed scheme incurs no extra communication overhead and can serve as a robust and efficient baseline for noise-tolerant federated learning. Experimental results demonstrate that our method outperforms existing approaches, highlighting its superiority.
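The following is a minimal sketch, in PyTorch, of the co-training idea as described in the abstract: the local and global models act as two complementary views, and each relabels the samples on which it is highly confident for the other to learn from. The function names, the confidence threshold, and the label-replacement rule are illustrative assumptions, not the authors' actual implementation.

```python
# Hedged sketch of confidence-based "teaching" between the local and global
# models in a federated client. Assumes both models are nn.Module classifiers
# over the same label space; all names here are hypothetical.

import torch
import torch.nn.functional as F


def select_confident(model, x, threshold=0.9):
    """Return a mask of samples the model predicts with high confidence,
    along with its pseudo-labels for those samples."""
    model.eval()
    with torch.no_grad():
        probs = F.softmax(model(x), dim=1)
        conf, pseudo = probs.max(dim=1)
    return conf >= threshold, pseudo


def co_train_step(local_model, global_model, x, y_noisy, opt_local, threshold=0.9):
    """One illustrative local update: where the global model is highly
    confident, its prediction replaces the (possibly noisy) stored label."""
    mask, pseudo = select_confident(global_model, x, threshold)
    targets = torch.where(mask, pseudo, y_noisy)  # trust the peer view on confident samples

    local_model.train()
    opt_local.zero_grad()
    loss = F.cross_entropy(local_model(x), targets)
    loss.backward()
    opt_local.step()
    return loss.item()
```

In a full round the roles would be exchanged symmetrically (the local model likewise filtering labels for the global update), so that only model parameters, and no extra label-correction messages, are communicated.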