Chinese Knowledge Graph Question Answering Method Based on ChineseBERT
To address the poor question-answering performance caused by the complexity of Chinese characters and semantics, a method called ChineseBERT-KBQA was proposed. The Chinese pre-trained language model ChineseBERT, which incorporates glyph and pinyin information, was used as the semantic embedding layer for the text, improving the performance of traditional semantic-parsing methods on the sub-tasks of entity mention recognition and relation prediction. Specifically, two models were proposed: an entity mention recognition model based on ChineseBERT-CRF and a relation prediction model based on ChineseBERT-TextCNN-Softmax, which together improve the semantic understanding of Chinese text. Finally, the final answer was predicted by combining the information obtained from the sub-tasks. Experiments were conducted on the MOOC Q&A and NLPCC2018 question-answering datasets, and the results demonstrate the effectiveness and accuracy of the proposed method for Chinese knowledge graph question answering.
knowledge graph; BERT; intelligent question answering; deep learning; pre-trained language models
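The abstract describes a two-branch semantic-parsing pipeline: a CRF tagging head for entity mention recognition and a TextCNN-plus-softmax head for relation prediction, both on top of ChineseBERT token representations. The sketch below is a rough PyTorch illustration of what such heads could look like, not the authors' released code: the encoder output is stubbed with random tensors standing in for ChineseBERT's glyph- and pinyin-fused token states, CRF decoding is replaced by a plain per-token emission layer, and all class names, dimensions, and the number of relations are assumptions.

# Minimal sketch (assumptions noted in comments), not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MentionTagger(nn.Module):
    """Emission layer over encoder token states for BIO-style mention tagging.
    In the paper these emissions would be decoded by a CRF layer; here a
    per-token argmax over the emissions is the stand-in."""
    def __init__(self, hidden: int, num_tags: int):
        super().__init__()
        self.emission = nn.Linear(hidden, num_tags)

    def forward(self, token_states):             # (batch, seq_len, hidden)
        return self.emission(token_states)        # (batch, seq_len, num_tags)


class RelationTextCNN(nn.Module):
    """TextCNN over encoder token states followed by a softmax classifier
    over candidate relations, as named in the abstract."""
    def __init__(self, hidden: int, num_relations: int,
                 kernel_sizes=(2, 3, 4), channels: int = 128):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, channels, k) for k in kernel_sizes])
        self.classifier = nn.Linear(channels * len(kernel_sizes), num_relations)

    def forward(self, token_states):              # (batch, seq_len, hidden)
        x = token_states.transpose(1, 2)          # (batch, hidden, seq_len)
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        features = torch.cat(pooled, dim=1)       # (batch, channels * n_kernels)
        return F.log_softmax(self.classifier(features), dim=-1)


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 32, 768           # 768 matches BERT-base hidden size
    states = torch.randn(batch, seq_len, hidden)  # stand-in for ChineseBERT output
    tag_scores = MentionTagger(hidden, num_tags=3)(states)        # B / I / O tags
    rel_scores = RelationTextCNN(hidden, num_relations=50)(states)
    print(tag_scores.shape, rel_scores.shape)     # (2, 32, 3) (2, 50)

In a full pipeline, the predicted mention would be linked to a candidate entity in the knowledge graph, the relation classifier would score the relations attached to that entity, and the answer would be read off the top-scoring triple, matching the sub-task combination described in the abstract.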