Byzantine-Robust and Communication-Efficient Personalized Federated Learning

This paper explores constrained non-convex personalized federated learning (PFL), in which a group of workers train local models and a global model under the coordination of a server. To address the challenges of efficient information exchange and robustness against so-called Byzantine workers, we propose a projected stochastic gradient descent algorithm for PFL that simultaneously ensures Byzantine-robustness and communication efficiency. We implement personalized learning at the workers, aided by the global model, and employ a Huber function-based robust aggregation with an adaptive threshold-selecting strategy at the server to reduce the effects of Byzantine attacks. To improve communication efficiency, we incorporate random communication that allows multiple local updates per communication round. We establish the convergence of our algorithm, characterizing the effects of Byzantine attacks, random communication, and stochastic gradients on the learning error. Numerical experiments on neural network training demonstrate the superiority of our algorithm over existing methods.
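The Huber function-based robust aggregation described above can be illustrated with a minimal sketch. The Huber M-estimator of the workers' messages can be computed by iteratively reweighted averaging, which gives honest workers full weight and down-weights outliers beyond a threshold. Note that the function name `huber_aggregate`, the fixed threshold `delta`, and the iteration count are illustrative assumptions, not the paper's exact scheme (in particular, the paper selects the threshold adaptively):

```python
import numpy as np

def huber_aggregate(grads, delta, iters=10):
    """Sketch of a Huber M-estimator of worker gradients via
    iteratively reweighted averaging (illustrative; the paper's
    adaptive threshold selection is not reproduced here)."""
    z = np.mean(grads, axis=0)  # initialize at the plain mean
    for _ in range(iters):
        dists = np.linalg.norm(grads - z, axis=1)
        # Huber weights: weight 1 inside the threshold delta,
        # weight delta/||g_i - z|| (i.e., clipping) beyond it
        w = np.minimum(1.0, delta / np.maximum(dists, 1e-12))
        z = (w[:, None] * grads).sum(axis=0) / w.sum()
    return z
```

With a few Byzantine workers sending large malicious vectors, the plain mean is pulled far from the honest gradients, while the Huber aggregate stays close to them.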

Servers, Stochastic processes, Signal processing algorithms, Data models, Computational modeling, Vectors, Federated learning, Convergence, Adaptation models, Robustness

Jiaojiao Zhang, Xuechao He, Yue Huang, Qing Ling


School of Computer Science and Engineering, Sun Yat-Sen University, Guangdong, China

Division of Decision and Control Systems, KTH Royal Institute of Technology, Stockholm, Sweden


2025

IEEE Transactions on Signal Processing

ISSN:
Year, Volume (Issue): 2025, 73(1)