Byzantine-Robust and Communication-Efficient Personalized Federated Learning
Full-text links: NETL | NSTL | IEEE
This paper explores constrained non-convex personalized federated learning (PFL), in which a group of workers train local models and a global model under the coordination of a server. To address the challenges of efficient information exchange and robustness against so-called Byzantine workers, we propose a projected stochastic gradient descent algorithm for PFL that simultaneously ensures Byzantine-robustness and communication efficiency. We implement personalized learning at the workers, aided by the global model, and employ a Huber function-based robust aggregation with an adaptive threshold-selecting strategy at the server to mitigate the effects of Byzantine attacks. To improve communication efficiency, we incorporate random communication, which allows multiple local updates per communication round. We establish the convergence of our algorithm, characterizing how Byzantine attacks, random communication, and stochastic gradients affect the learning error. Numerical experiments demonstrate that our algorithm outperforms existing methods in neural network training.
School of Computer Science and Engineering, Sun Yat-Sen University, Guangdong, China|Division of Decision and Control Systems, KTH Royal Institute of Technology, Stockholm, Sweden
School of Computer Science and Engineering, Sun Yat-Sen University, Guangdong, China
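The Huber function-based robust aggregation described in the abstract can be illustrated with a minimal sketch. The idea is that the server aggregates worker gradients by solving a Huber M-estimation problem, whose solution down-weights updates that deviate far from the consensus, limiting the influence of Byzantine workers. The sketch below is an assumption-laden illustration, not the paper's exact algorithm: the iteratively reweighted averaging solver and the adaptive threshold rule (median of current distances) are illustrative choices, and the function name `huber_aggregate` is hypothetical.

```python
import numpy as np

def huber_aggregate(grads, n_iters=20, tol=1e-8):
    """Robust aggregation of worker gradients via a Huber M-estimator.

    Solves argmin_z sum_i huber(||g_i - z||) by iteratively reweighted
    averaging. This is an illustrative sketch: the paper's exact update
    and adaptive threshold-selecting strategy may differ.

    grads: (m, d) array, one gradient per worker (some may be Byzantine).
    """
    z = np.median(grads, axis=0)  # robust initialization
    for _ in range(n_iters):
        dists = np.linalg.norm(grads - z, axis=1)
        # Adaptive threshold (illustrative choice): median of current
        # distances, so roughly half the workers are treated as inliers.
        delta = np.median(dists) + 1e-12
        # Huber weights: inliers get weight 1, outliers are down-weighted
        # proportionally to delta / distance.
        w = np.minimum(1.0, delta / np.maximum(dists, 1e-12))
        z_new = (w[:, None] * grads).sum(axis=0) / w.sum()
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z
```

With a handful of honest gradients clustered together and a few adversarial ones placed far away, the weighted average stays close to the honest cluster, whereas a plain mean would be pulled toward the attackers; this is the robustness property the paper's aggregation rule is designed to provide.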