Research on Machine Reading Comprehension Based on XLNet and Bidirectional Attention
Machine reading comprehension aims to enable a machine to read and accurately understand a natural-language text and answer a given question, a task with high research and application value. To address the low accuracy caused by the lack of effective interaction information between passages and questions in existing open-domain machine reading comprehension models, this paper proposes a reading comprehension model based on XLNet and bidirectional attention. In this model, the XLNet pretrained language model generates context-dependent word vectors to represent the passage and the question at the embedding layer; a two-layer LSTM extracts semantic features at the encoding layer; two bidirectional attention mechanisms (Bi-Attention and Co-Attention) extract sequence features at the interaction layer; a self-attention mechanism then further enhances the text feature representation, and the resulting vectors are fused. Finally, the start and end positions of the answer are obtained at the output layer after bidirectional LSTM modeling. Experimental results on the DuReader Chinese dataset show improvements in EM and F1 scores.
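To make the pipeline described above concrete, the following PyTorch sketch composes the layers in the stated order: XLNet word vectors, a two-layer LSTM encoder, bidirectional passage-question attention, self-attention, vector fusion, and a BiLSTM output layer predicting start and end positions. It is a minimal illustration, not the paper's implementation: the hidden sizes, the BiDAF-style similarity function used for Bi-Attention, the fusion layer, and the number of self-attention heads are assumptions, the Co-Attention branch is omitted, and the XLNet embeddings are assumed to be produced upstream (e.g. by transformers' XLNetModel).

```python
# Minimal sketch under assumed hyperparameters; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiAttention(nn.Module):
    """BiDAF-style bidirectional attention between passage (C) and question (Q)."""
    def __init__(self, dim):
        super().__init__()
        self.sim = nn.Linear(3 * dim, 1, bias=False)

    def forward(self, c, q):                       # c: (B, Lc, D), q: (B, Lq, D)
        lc, lq = c.size(1), q.size(1)
        c_exp = c.unsqueeze(2).expand(-1, -1, lq, -1)
        q_exp = q.unsqueeze(1).expand(-1, lc, -1, -1)
        s = self.sim(torch.cat([c_exp, q_exp, c_exp * q_exp], dim=-1)).squeeze(-1)
        c2q = torch.bmm(F.softmax(s, dim=-1), q)               # passage-to-question
        q2c = torch.bmm(F.softmax(s.max(dim=-1).values, dim=-1).unsqueeze(1), c)
        q2c = q2c.expand(-1, lc, -1)                            # question-to-passage
        return torch.cat([c, c2q, c * c2q, c * q2c], dim=-1)    # (B, Lc, 4D)


class XLNetBiAttReader(nn.Module):
    """Abstract's pipeline: XLNet embeddings -> 2-layer LSTM encoder ->
    bidirectional attention -> self-attention -> fusion -> BiLSTM -> span logits."""
    def __init__(self, xlnet_dim=768, hidden=256):
        super().__init__()
        # Encoding layer: two-layer BiLSTM over XLNet word vectors.
        self.encoder = nn.LSTM(xlnet_dim, hidden, num_layers=2,
                               batch_first=True, bidirectional=True)
        d = 2 * hidden
        self.bi_att = BiAttention(d)
        # Self-attention to further enhance the attended passage representation.
        self.self_att = nn.MultiheadAttention(4 * d, num_heads=4, batch_first=True)
        self.fuse = nn.Linear(8 * d, d)
        # Output layer: BiLSTM modelling, then start/end position logits.
        self.out_lstm = nn.LSTM(d, hidden, batch_first=True, bidirectional=True)
        self.start = nn.Linear(d, 1)
        self.end = nn.Linear(d, 1)

    def forward(self, passage_emb, question_emb):
        # passage_emb / question_emb: context-dependent word vectors, assumed to
        # come from an XLNet encoder (e.g. XLNetModel(...).last_hidden_state).
        c, _ = self.encoder(passage_emb)
        q, _ = self.encoder(question_emb)
        g = self.bi_att(c, q)                      # interaction layer
        h, _ = self.self_att(g, g, g)              # self-attention enhancement
        fused = torch.relu(self.fuse(torch.cat([g, h], dim=-1)))  # vector fusion
        m, _ = self.out_lstm(fused)
        return self.start(m).squeeze(-1), self.end(m).squeeze(-1)


# Toy usage with random stand-in embeddings (batch=2, passage=50, question=12 tokens).
model = XLNetBiAttReader()
start_logits, end_logits = model(torch.randn(2, 50, 768), torch.randn(2, 12, 768))
print(start_logits.shape, end_logits.shape)        # torch.Size([2, 50]) twice
```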