Backdoor Attacks on Image Classification Models in Deep Neural Networks

Deep neural networks (DNNs) are widely applied in many applications and achieve state-of-the-art performance. However, the structure of a DNN lacks transparency and interpretability for its users. Attackers can exploit this property to embed a Trojan horse in the DNN structure, such as inserting a backdoor, so that the DNN learns both the normal main task and an additional malicious task at the same time. Besides, DNN training relies on a dataset, and attackers can tamper with the training data to interfere with the training process, for example by attaching a trigger to input data. Because of these defects in DNN structure and data, backdoor attacks pose a serious threat to the security of DNNs. A backdoored DNN performs well on benign inputs while outputting an attacker-specified label on trigger-attached inputs. Backdoor attacks can be conducted at almost every stage of the machine learning pipeline. Although there is some research on backdoor attacks against image classification, a systematic review is still rare in this field. This paper is a comprehensive review of backdoor attacks. According to whether attackers have access to the training data, we divide the various backdoor attacks into two types: poisoning-based attacks and non-poisoning-based attacks. We go through the details of each work along the timeline, discussing its contributions and deficiencies. We propose a detailed mathematical backdoor model to summarize all kinds of backdoor attacks. In the end, we provide some insights for future studies.
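
To make the poisoning-based, trigger-attaching setting concrete: a backdoored classifier f should satisfy f(x) = y on benign pairs (x, y) while f(x stamped with the trigger) = y_t for the attacker-specified target label y_t. The Python sketch below illustrates one classic instance of this idea, a BadNets-style white-patch trigger stamped on a small fraction of training images that are then relabeled to the target class; the helper names, patch size, and poison rate are illustrative assumptions, not details drawn from this paper.

    # Minimal sketch of a poisoning-based backdoor (BadNets-style patch trigger).
    # poison_rate, target_label, and the patch size are illustrative choices.
    import numpy as np

    def attach_trigger(image, patch_size=3):
        """Stamp a white square in the bottom-right corner as the trigger."""
        poisoned = image.copy()
        poisoned[-patch_size:, -patch_size:] = 1.0  # pixel values assumed in [0, 1]
        return poisoned

    def poison_dataset(images, labels, target_label=0, poison_rate=0.05, seed=0):
        """Attach the trigger to a small fraction of samples and relabel them
        with the attacker-specified target class."""
        rng = np.random.default_rng(seed)
        images, labels = images.copy(), labels.copy()
        n_poison = int(len(images) * poison_rate)
        idx = rng.choice(len(images), size=n_poison, replace=False)
        for i in idx:
            images[i] = attach_trigger(images[i])
            labels[i] = target_label
        return images, labels

    # Usage: poison 5% of a toy 28x28 grayscale dataset.
    X = np.random.rand(100, 28, 28)
    y = np.random.randint(0, 10, size=100)
    X_poisoned, y_poisoned = poison_dataset(X, y, target_label=7)

A model trained on the poisoned set learns to associate the patch with the target class, so it classifies clean inputs normally but predicts class 7 whenever the patch appears.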

Backdoor attack; Poisoning-based attacks; Non-poisoning-based attacks; Security; Review

ZHANG Quanxin, MA Wencong, WANG Yajie, ZHANG Yaoyuan, SHI Zhiwei, LI Yuanzhang

Beijing Institute of Technology,Beijing 100036,China

China Information Technology Security Evaluation Center,Beijing 100085,China

National Natural Science Foundation of China (61876019)

Chinese Journal of Electronics

Indexed by: CSTPCD, SCI, EI
ISSN: 1022-4653
Year, Volume (Issue): 2022, 31(2)