PAFormer: Anomaly Detection of Time Series With Parallel-Attention Transformer

Time-series anomaly detection is a critical task that plays a pivotal role in data mining and quality management. Current anomaly detection methods are typically based on reconstruction or forecasting algorithms, as these methods can learn compressed data representations and model time dependencies. However, most methods rely on learning normal distribution patterns, which can be difficult to achieve in real-world engineering applications. Furthermore, real-world time-series data is highly imbalanced, with a severe lack of representative samples for anomalous data, which can lead to model learning failure. In this article, we propose a novel end-to-end unsupervised framework called the parallel-attention transformer (PAFormer), which discriminates anomalies by modeling both the global characteristics and local patterns of time series. Specifically, we construct parallel-attention (PA), which includes two core modules: the global enhanced representation module (GERM) and the local perception module (LPM). GERM consists of two pattern units and a normalization module, with attention weights that indicate the relationship of each data point to the whole series (global). Because anomalous points are rare, their associations concentrate on adjacent data points. LPM is composed of a learnable Laplace kernel function that learns these neighborhood relevancies through the distributional properties of the kernel function (local). We employ the PA to learn the global-local distributional difference for each data point, which enables us to discriminate anomalies. Finally, we propose a two-stage adversarial loss to optimize the model. We conduct experiments on five public benchmark datasets (real-world datasets) and one synthetic dataset. The results show that PAFormer outperforms state-of-the-art baselines.
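The abstract's core idea, contrasting a global attention distribution with a local, kernel-induced one per data point, can be illustrated with a minimal sketch. This is not the authors' implementation: the random projections standing in for learned Q/K weights, the fixed Laplace scale `scale_b` (learnable in the paper), and the symmetric-KL anomaly score are all illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def parallel_attention_sketch(x, scale_b=2.0, rng=None):
    """Toy sketch of a global/local parallel-attention contrast.

    x: (T, d) window of a time series.
    Returns the global attention map, the local (Laplace-kernel)
    attention map, and a per-point symmetric-KL discrepancy used
    here as a stand-in anomaly score.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    T, d = x.shape
    # Hypothetical random projections in place of learned Q/K weights.
    Wq = rng.standard_normal((d, d)) / np.sqrt(d)
    Wk = rng.standard_normal((d, d)) / np.sqrt(d)
    Q, K = x @ Wq, x @ Wk
    # Global branch: scaled dot-product attention over the whole window.
    global_att = softmax(Q @ K.T / np.sqrt(d), axis=-1)
    # Local branch: Laplace kernel over temporal distance |i - j|,
    # normalized row-wise; scale_b mimics the learnable kernel scale.
    idx = np.arange(T)
    dist = np.abs(idx[:, None] - idx[None, :])
    local_att = np.exp(-dist / scale_b)
    local_att /= local_att.sum(axis=-1, keepdims=True)
    # Symmetric KL between the two row distributions: points whose
    # global associations diverge from the local prior score higher.
    eps = 1e-12
    kl_gl = (global_att * np.log((global_att + eps) / (local_att + eps))).sum(-1)
    kl_lg = (local_att * np.log((local_att + eps) / (global_att + eps))).sum(-1)
    return global_att, local_att, kl_gl + kl_lg
```

Both attention maps are row-stochastic, so the per-point discrepancy is a nonnegative score that can be thresholded to flag anomalies.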

Anomaly detection; Transformers; Feature extraction; Time series analysis; Data models; Computational modeling; Predictive models

Ningning Bai, Xiaofeng Wang, Ruidong Han, Qin Wang, Zinian Liu


Department of Mathematics, Xi’an University of Technology, Xi’an, China

School of Computer Science and Engineering, Xi’an University of Technology, Xi’an, China

School of Mathematics and Information Technology, Yuncheng University, Yuncheng, China

School of Computer Science and Engineering, Xi’an University of Technology, Xi’an, China

2025

IEEE Transactions on Neural Networks and Learning Systems