Research on a Privacy Protection Method for Intelligent Networking Based on Granular Gradient Perturbation
In intelligent networked transportation systems, federated learning can improve the privacy protection of transportation data by enabling roadside infrastructures to perform distributed local training and global aggregation under the scheduling of an edge server, but a risk of privacy leakage remains. Attackers can use a gradient leakage attack to recover the training transportation data of the roadside infrastructures from their shared model parameters. This paper proposes a granular gradient perturbation method based on differential privacy theory and information theory to defend against the gradient leakage attack. The defense method selects the neurons with low Fisher information values and adds designed Laplace noise to their corresponding gradients in order to perturb the data reconstruction performed by the attackers. A theoretical analysis is provided to show that the proposed defense method satisfies differential privacy and preserves training convergence. Experimental results demonstrate that the proposed defense method effectively defends against the gradient leakage attack while keeping the training accuracy of federated learning above 90%, outperforming both the overall gradient perturbation method and the random gradient perturbation method.
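To illustrate the idea described above, the following is a minimal sketch (not the authors' implementation) of granular gradient perturbation: the gradients of the parameters with the lowest Fisher information are perturbed with Laplace noise before being shared with the edge server. Here the Fisher information is approximated by the squared gradient, and the selection ratio, sensitivity, and privacy budget are illustrative assumptions.

```python
import numpy as np

def granular_gradient_perturbation(gradients, select_ratio=0.3,
                                   sensitivity=1.0, epsilon=1.0,
                                   rng=None):
    """Perturb the gradients of low-Fisher-information parameters.

    gradients    : 1-D array of per-parameter gradients from local training
    select_ratio : fraction of parameters (lowest Fisher information) to perturb
    sensitivity  : assumed L1 sensitivity of the gradient (e.g. after clipping)
    epsilon      : differential-privacy budget for the Laplace mechanism
    """
    rng = np.random.default_rng() if rng is None else rng
    grads = np.asarray(gradients, dtype=float)

    # Diagonal Fisher information approximated by the squared gradient.
    fisher = grads ** 2

    # Indices of the parameters carrying the least Fisher information.
    k = max(1, int(select_ratio * grads.size))
    low_info = np.argsort(fisher)[:k]

    # Laplace mechanism: noise scale b = sensitivity / epsilon.
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon, size=k)

    perturbed = grads.copy()
    perturbed[low_info] += noise
    return perturbed

# Example: perturb a toy gradient vector before uploading it to the edge server.
g = np.array([0.8, -0.05, 0.3, 0.01, -0.6])
print(granular_gradient_perturbation(g, select_ratio=0.4, epsilon=0.5))
```

In this sketch only the selected low-information coordinates are noised, which is what distinguishes the granular scheme from perturbing the whole gradient or a random subset of it.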
federated learning; intelligent networking; differential privacy; information entropy theory