Temporal point processes have emerged as an important method for modeling asynchronous event sequences, finding wide application in fields such as seismology and healthcare. The introduction of deep learning models such as the Transformer has led to breakthroughs in predictive performance. To address the learning-bias issue in Transformer-based Hawkes process models for event sequence modeling, a multi-branch weighted Transformer Hawkes process model is proposed. Inspired by the multi-branch concept, this model assigns varying importance to the dependencies learned from different perspectives, thereby enhancing its capacity for modeling event sequences. To overcome the limited local perception of Transformer-based Hawkes process models, a causal-convolution-based local perception enhancement network is constructed, improving the model's attention to local contextual information in event sequences. In this paper, experiments on multiple synthetic and real-world datasets provide comprehensive evaluations using metrics such as log-likelihood, root mean squared error of event times, and event-type prediction accuracy. The experimental results show that the proposed model outperforms other benchmark models, and ablation experiments further confirm the effectiveness of the local perception enhancement network.
Keywords: temporal point process; Hawkes process; deep learning; Transformer; multi-branch weighting
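The two mechanisms named in the abstract can be illustrated in isolation. The sketch below is hypothetical and not the paper's implementation: the function names, kernel size, and softmax-based branch weighting are assumptions. It shows (a) a causal 1-D convolution, where left-only padding guarantees that the output at step t depends only on inputs at steps ≤ t, and (b) a weighted combination of multi-branch outputs.

```python
import numpy as np

def causal_conv1d(x, kernel):
    """Causal 1-D convolution: the output at step t depends only on
    inputs at steps <= t, achieved by padding on the left only."""
    k = len(kernel)
    padded = np.concatenate([np.zeros(k - 1), x])
    # Flip the kernel so this matches the standard convolution definition.
    return np.array([padded[t:t + k] @ kernel[::-1] for t in range(len(x))])

def weighted_branches(branch_outputs, logits):
    """Combine branch outputs with softmax weights (assumed weighting scheme)."""
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return sum(wi * b for wi, b in zip(w, branch_outputs))

# Toy event-feature sequence and a 3-tap kernel.
x = np.array([1.0, 2.0, 3.0, 4.0])
kernel = np.array([0.5, 0.3, 0.2])
y = causal_conv1d(x, kernel)  # each y[t] uses only x[:t+1]

# Two hypothetical branch outputs combined with learned importance logits.
combined = weighted_branches([y, 2.0 * y], np.array([0.0, 0.0]))
```

Because the padding is one-sided, perturbing a future input leaves all earlier outputs unchanged, which is the property the local perception network relies on when attending to local context.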