Transformer Tracking Algorithm Integrating Fast Edge Attention
To address model degradation and tracking drift in long-term target tracking, a Transformer tracking algorithm integrating fast edge attention, TransFEA (fast edge attention on Transformer), is proposed. It uses ResNet-50 as the backbone of a Siamese network and introduces an attention network at the back end of each residual block during feature extraction to enhance the key information and global information of the target. The edge attention network (EA) extracts the feature vectors of the template and the search region, while the fast attention network (FA) computes the attention response and measures the similarity between the two regions to adjust the target position. A multi-layer perceptron is designed to predict the bounding boxes and avoid excessive hyperparameters, allowing the tracker to strike a balance between accuracy and lightweight design. Experimental results show that the success rate and precision of TransFEA on the LaSOT dataset reach 65.3% and 69.1%, respectively, at a running speed of 90 FPS, improving both the success rate and precision of long-term tracking.
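The sketch below illustrates the pipeline outlined in the abstract: a shared (Siamese) ResNet-50 trunk, a template/search feature-vector extractor standing in for EA, a cross-attention module standing in for FA that produces the attention response between the two regions, and an MLP bounding-box head. This is only a minimal PyTorch sketch assuming a standard torchvision ResNet-50; the `EdgeAttention` and `FastAttention` classes are hypothetical stand-ins (the abstract does not specify their internals), and the per-residual-block attention placement is simplified to a single block after the trunk.

```python
# Minimal sketch of the TransFEA-style pipeline described above.
# EdgeAttention / FastAttention are hypothetical stand-ins for the paper's
# EA and FA modules; only their roles (template/search feature extraction,
# attention-response similarity, MLP box prediction) come from the abstract.
import torch
import torch.nn as nn
import torchvision


class EdgeAttention(nn.Module):
    """Hypothetical EA block: projects backbone features into token vectors."""
    def __init__(self, in_channels: int, dim: int = 256):
        super().__init__()
        self.proj = nn.Conv2d(in_channels, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # (B, C, H, W) -> (B, H*W, dim): a sequence of feature vectors
        return self.proj(x).flatten(2).transpose(1, 2)


class FastAttention(nn.Module):
    """Hypothetical FA block: cross-attention of search tokens over template tokens."""
    def __init__(self, dim: int = 256, heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, search: torch.Tensor, template: torch.Tensor) -> torch.Tensor:
        # Attention response: how strongly each search token matches the template
        out, _ = self.attn(query=search, key=template, value=template)
        return out


class BoxMLP(nn.Module):
    """MLP head predicting one normalized (cx, cy, w, h) box per image."""
    def __init__(self, dim: int = 256, hidden: int = 256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4), nn.Sigmoid(),
        )

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # Pool the attended search tokens, then regress the box
        return self.mlp(tokens.mean(dim=1))


class TransFEASketch(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        backbone = torchvision.models.resnet50(weights=None)
        # Shared (Siamese) ResNet-50 trunk, dropping the avgpool and fc layers
        self.trunk = nn.Sequential(*list(backbone.children())[:-2])
        self.ea = EdgeAttention(2048, dim)
        self.fa = FastAttention(dim)
        self.head = BoxMLP(dim)

    def forward(self, template: torch.Tensor, search: torch.Tensor) -> torch.Tensor:
        z = self.ea(self.trunk(template))   # template feature vectors
        x = self.ea(self.trunk(search))     # search-region feature vectors
        return self.head(self.fa(x, z))     # attention response -> bounding box


if __name__ == "__main__":
    model = TransFEASketch()
    z = torch.randn(1, 3, 128, 128)   # template crop
    x = torch.randn(1, 3, 256, 256)   # search-region crop
    print(model(z, x).shape)          # torch.Size([1, 4])
```

Keeping the box head as a small MLP, rather than an anchor-based or correlation-filter head, is what the abstract credits for avoiding excessive hyperparameters and keeping the tracker lightweight.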