Temporal point processes have emerged as an important method for modeling asynchronous event sequences, finding wide application in fields such as seismology and healthcare. The introduction of deep learning models such as the Transformer has led to breakthroughs in predictive performance. To address the learning-bias issue in Transformer-based Hawkes process models for event sequence modeling, a multi-branch weighted Transformer Hawkes process model is proposed. Inspired by the multi-branch concept, the model assigns varying importance to the dependencies learned from different perspectives, thereby enhancing its capacity to model event sequences. To overcome the limited local perception of Transformer-based Hawkes process models, a causal-convolution-based local perception enhancement network is constructed, improving the model's attention to local contextual information in event sequences. Experiments on multiple synthetic and real-world datasets evaluate the model comprehensively using metrics including log-likelihood, root mean squared error of event times, and event-type prediction accuracy. The experimental results show that the proposed model outperforms other benchmark models, and ablation experiments confirm the effectiveness of the local perception enhancement network.
Key words: temporal point process; Hawkes process; deep learning; Transformer; multi-branch weighted
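The local perception enhancement network rests on causal convolution: each output position may depend only on the current and preceding events, never on future ones. A minimal NumPy sketch of this idea follows; the kernel weights and input values are hypothetical illustrations, not parameters from the paper.

```python
import numpy as np

def causal_conv1d(x, kernel):
    """1-D causal convolution: left-pad by (k - 1) zeros so that
    output y[t] depends only on inputs x[0..t], never on x[t+1..]."""
    k = len(kernel)
    padded = np.concatenate([np.zeros(k - 1), x])
    # kernel[-1] weights the current input; earlier taps weight past inputs
    return np.array([np.dot(padded[t:t + k], kernel) for t in range(len(x))])

# Hypothetical example sequence and weights
x = np.array([1.0, 2.0, 3.0, 4.0])
kernel = np.array([0.5, 0.25, 0.25])
y = causal_conv1d(x, kernel)
```

Because of the left padding, changing a later event cannot alter earlier outputs, which is the property that lets the network sharpen local context without leaking future information into the intensity estimate.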