Research on Knowledge Enhanced Text Understanding Based on Heterogeneous Graph Neural Network
Popular language models such as BERT exhibit strong reading-comprehension ability thanks to pre-training on large-scale corpora. However, because knowledge co-occurrence within sentences is sparse, it is difficult for such models to understand scenes with complex semantics. To address this problem, a method that introduces a knowledge graph combined with heterogeneous graphs to enhance reading comprehension is proposed. The method constructs a heterogeneous graph from the entities in the sentence and generates sub-views via edge dropout. An R-GCN aggregates neighbor information within the heterogeneous sub-views, and a similarity-based metric loss constrains the representations of the same node across different views. Finally, a knowledge-enhanced textual representation is obtained. Results on WinoGrande show that the method yields a significant improvement in accuracy, 1.3% higher than the baseline without knowledge enhancement.
knowledge graph; heterogeneous graph neural network; pre-trained language model; reading comprehension; natural language processing
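The pipeline described in the abstract (edge-dropout sub-views, R-GCN neighbor aggregation, and a similarity-based consistency loss over the same node in different views) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the layer sizes, the single-relation toy graph, and the mean aggregation with a cosine-based loss are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def edge_dropout(edges, p, rng):
    # Randomly drop a fraction p of edges to create a graph sub-view.
    keep = rng.random(len(edges)) >= p
    return [e for e, k in zip(edges, keep) if k]

def rgcn_layer(h, edges_by_rel, rel_weights, self_w):
    # Minimal R-GCN layer: per-relation linear transform of neighbor
    # features, mean-aggregated per node, plus a self-loop transform.
    out = h @ self_w
    for rel, edges in edges_by_rel.items():
        W = rel_weights[rel]
        agg = np.zeros_like(out)
        deg = np.zeros(len(h))
        for src, dst in edges:
            agg[dst] += h[src] @ W
            deg[dst] += 1
        out += agg / np.maximum(deg, 1)[:, None]
    return np.maximum(out, 0.0)  # ReLU

def consistency_loss(z1, z2):
    # Similarity-based metric loss: mean (1 - cosine similarity)
    # between the same node's embeddings in the two sub-views.
    num = (z1 * z2).sum(axis=1)
    den = np.linalg.norm(z1, axis=1) * np.linalg.norm(z2, axis=1) + 1e-8
    return float(np.mean(1.0 - num / den))

# Toy heterogeneous graph: 4 entity nodes, one relation type (assumed).
n, d = 4, 3
h = rng.random((n, d))
edges = {"rel0": [(0, 1), (1, 2), (2, 3), (3, 0)]}
rel_weights = {"rel0": rng.random((d, d))}
self_w = np.eye(d)

# Two sub-views from independent edge dropout, encoded by the same layer.
view1 = {r: edge_dropout(e, 0.3, rng) for r, e in edges.items()}
view2 = {r: edge_dropout(e, 0.3, rng) for r, e in edges.items()}
z1 = rgcn_layer(h, view1, rel_weights, self_w)
z2 = rgcn_layer(h, view2, rel_weights, self_w)
loss = consistency_loss(z1, z2)
```

Minimizing `loss` pulls together the two views' embeddings of each node, which is one common way to realize the cross-view constraint the abstract describes.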