A short text semantic matching strategy based on BERT sentence vector and differential attention
Short text semantic matching is a core problem in natural language processing, with wide applications in automatic question answering, search engines, and other fields. Most previous work considered only the similar parts between texts while ignoring the differing parts, so models could not fully exploit the key information for deciding whether two texts match. To address this issue, this paper proposes a short text semantic matching strategy based on BERT sentence vectors and differential attention. BERT is used to vectorize sentence pairs, a BiLSTM encodes the sequences, and a multi-head differential attention mechanism is introduced to obtain attention weights that capture the intent differences between each word vector and the global semantic information of the text. A one-dimensional convolutional neural network then reduces the dimensionality of the sentence pairs' semantic feature vectors. Finally, the word-sentence vectors are concatenated and fed into a fully connected layer to compute the semantic matching degree between the two sentences. Experiments on the LCQMC and BQ datasets show that this strategy effectively extracts textual semantic difference information, enabling the model to achieve better results.
Keywords: short text semantic matching; word-sentence vector; intent representation; differential attention
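The pipeline described in the abstract (BERT vectorization, BiLSTM encoding, multi-head differential attention between each word vector and the global sentence vector, 1-D convolutional dimension reduction, concatenation, and a fully connected matching layer) can be sketched as a small PyTorch module. This is a minimal illustration, not the authors' implementation: the class name, layer sizes, head count, mean pooling for the global vector, and max pooling after the convolution are all assumptions, and random tensors stand in for BERT embeddings.

```python
import torch
import torch.nn as nn

class DifferentialMatcher(nn.Module):
    """Sketch of the described strategy. Inputs are assumed to be BERT token
    embeddings of shape (batch, seq_len, emb_dim); all hyperparameters here
    are illustrative choices, not values from the paper."""

    def __init__(self, emb_dim=768, hidden=128, heads=4, conv_out=64):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        # Multi-head attention driven by word-vs-global difference vectors.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        # 1-D convolution to reduce the dimensionality of the features.
        self.conv = nn.Conv1d(2 * hidden, conv_out, kernel_size=3, padding=1)
        self.fc = nn.Linear(2 * conv_out, 1)

    def encode(self, emb):
        h, _ = self.bilstm(emb)                 # (B, T, 2*hidden)
        g = h.mean(dim=1, keepdim=True)         # global sentence vector (assumed mean pool)
        diff = h - g                            # difference of each word from the global
        a, _ = self.attn(diff, h, h)            # differences act as attention queries
        c = self.conv(a.transpose(1, 2))        # (B, conv_out, T)
        return c.max(dim=2).values              # (B, conv_out)

    def forward(self, emb_a, emb_b):
        fa, fb = self.encode(emb_a), self.encode(emb_b)
        # Concatenate the two sentence features and score the match in [0, 1].
        return torch.sigmoid(self.fc(torch.cat([fa, fb], dim=1))).squeeze(-1)
```

In use, the two sentences of a pair would each be passed through BERT to obtain token embeddings, which are then fed to `forward`; the scalar output is the estimated matching degree.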