Foreground Object Detection and Regression-based Video Anomaly Detection Method
Video anomaly detection finds wide application in intelligent security. Methods based on generative models have attracted extensive attention in academia due to their powerful generative capabilities. However, such methods typically involve a large number of parameters and often rely on vast amounts of training data, limiting their applicability in real-world scenarios. This paper proposes a video anomaly detection method based on foreground object detection and regression (FODR-VAD). First, foreground objects are detected with an object detector, and object-centric spatio-temporal cubes are constructed. Second, pseudo-anomalous data are created by random shuffling. Finally, the one-class video anomaly detection problem is converted into a regression task, and feature representations are optimized under the supervised learning paradigm. With fewer than 1M trainable parameters and using less than half of each training set, the proposed method achieves Micro-AUC scores of 99.09%, 88.16%, and 78.47% on the UCSD Ped2, CUHK Avenue, and ShanghaiTech datasets, respectively. The results demonstrate that the proposed method significantly reduces the training-data requirement while maintaining high anomaly detection capability.
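The abstract outlines a three-step pipeline. The sketch below (Python/PyTorch) is a minimal illustration of that pipeline under stated assumptions, not the authors' implementation: the cube size (8 frames of 32×32 crops), the tiny 3D-CNN regressor, the MSE loss, and the fixed detection box are all hypothetical choices; in the paper an object detector supplies the boxes.

# Minimal sketch of the FODR-VAD idea from the abstract, NOT the authors'
# released code. Cube size, detector, network, and loss are assumptions made
# only to illustrate the three steps: (1) object-centric spatio-temporal
# cubes, (2) pseudo-anomalies via random frame shuffling, (3) supervised
# regression of an anomaly score.
import torch
import torch.nn as nn

T, H, W = 8, 32, 32  # assumed cube: 8 consecutive frames of 32x32 crops

def make_cube(frames, box):
    """Crop one (assumed fixed) detection box from T consecutive frames."""
    x1, y1, x2, y2 = box
    crops = [nn.functional.interpolate(
                 f[:, y1:y2, x1:x2].unsqueeze(0), size=(H, W)).squeeze(0)
             for f in frames]
    return torch.stack(crops, dim=1)         # (C, T, H, W)

def pseudo_anomaly(cube):
    """Destroy temporal order: randomly shuffle the frame axis."""
    return cube[:, torch.randperm(cube.shape[1])]

class Regressor(nn.Module):
    """Tiny 3D-CNN (well under 1M params) mapping a cube to a score in [0,1]."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(32, 1), nn.Sigmoid())

    def forward(self, x):                    # x: (B, C, T, H, W)
        return self.net(x).squeeze(-1)       # (B,)

# One training step: intact (normal) cubes -> target 0, shuffled cubes -> 1,
# so the one-class problem is trained as ordinary supervised regression.
model, loss_fn = Regressor(), nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

frames = [torch.rand(3, 240, 360) for _ in range(T)]   # stand-in video clip
cube = make_cube(frames, box=(100, 50, 180, 150))      # hypothetical box
batch = torch.stack([cube, pseudo_anomaly(cube)])      # (2, C, T, H, W)
target = torch.tensor([0.0, 1.0])

opt.zero_grad()
loss = loss_fn(model(batch), target)
loss.backward()
opt.step()
print(f"training loss: {loss.item():.4f}")

The abstract does not specify the perturbation beyond "random shuffling", so the target values and loss here should be read as illustrative: labeling shuffled cubes 1 and intact cubes 0 is what lets a one-class problem be trained with an ordinary supervised regression loss.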

Keywords: video anomaly detection; pseudo anomaly; supervised learning; regression; spatio-temporal cube

肖剑 (Xiao Jian), 刘天元 (Liu Tianyuan), 吴祥 (Wu Xiang), 吉根林 (Ji Genlin)


School of Computer and Electronic Information / School of Artificial Intelligence, Nanjing Normal University, Nanjing 210023, Jiangsu, China

Department of Industrial and Systems Engineering, The Hong Kong Polytechnic University, Hong Kong 999077, China


Funding: National Natural Science Foundation of China (41971343)


Journal of Nanjing Normal University (Natural Science Edition)
Publisher: Nanjing Normal University

Indexed by: CSTPCD; Peking University Core Journals
Impact factor: 0.427
ISSN: 1001-4616
Year, Volume (Issue): 2024, 47(2)