Classification Algorithm for Short Text Based on Knowledge-Enhanced BERT
傅薛林¹, 金红¹, 郑玮浩¹, 张奕¹, 陶小梅²
Author information
- 1. College of Information Science and Engineering, Guilin University of Technology, Guilin 541004, Guangxi, China; Guangxi Key Laboratory of Embedded Technology and Intelligent System, Guilin University of Technology, Guilin 541004, Guangxi, China
- 2. School of Computer Science and Engineering / School of Software, Guangxi Normal University, Guilin 541004, Guangxi, China
Abstract
To address the weak classification performance of deep learning models on short texts, whose limited information and lack of domain knowledge make key information difficult to mine fully, a knowledge-enhanced short text classification algorithm based on bidirectional encoder representations from transformers (BERT), named KE-BERT, was proposed. A method for jointly modeling short text and domain knowledge was proposed, in which domain knowledge is introduced through a knowledge graph, and a knowledge adapter was proposed to perform knowledge enhancement between the encoder layers of BERT. Compared with other deep learning models on public short-text datasets, KE-BERT achieves a mean F1-score of 93.46% and a mean accuracy of 91.26%, indicating that the proposed model performs well.
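The abstract does not detail the internal structure of the knowledge adapter. As a rough illustration only, a common way to realize such a component is a bottleneck adapter that fuses an aligned knowledge-graph embedding into the hidden states between encoder layers; the sketch below (all dimensions, the concatenation-based fusion, the ReLU bottleneck, and the residual connection are assumptions, not the authors' design) shows the general idea in NumPy:

```python
import numpy as np

def knowledge_adapter(hidden, knowledge, W_down, b_down, W_up, b_up):
    """Hedged sketch of a bottleneck knowledge adapter (not the paper's design).

    hidden:    (seq_len, d_model) output of one BERT encoder layer
    knowledge: (seq_len, d_know)  token-aligned knowledge-graph embeddings

    The fused representation is down-projected to a small bottleneck,
    passed through a nonlinearity, up-projected back to d_model, and
    added to the original hidden states through a residual connection.
    """
    fused = np.concatenate([hidden, knowledge], axis=-1)   # (seq_len, d_model + d_know)
    bottleneck = np.maximum(0.0, fused @ W_down + b_down)  # ReLU bottleneck
    delta = bottleneck @ W_up + b_up                       # project back to d_model
    return hidden + delta                                  # residual connection

# Toy shapes: d_model=8, d_know=4, bottleneck width=3, seq_len=5.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 8))
k = rng.normal(size=(5, 4))
W_down, b_down = rng.normal(size=(12, 3)), np.zeros(3)
W_up, b_up = rng.normal(size=(3, 8)), np.zeros(8)

out = knowledge_adapter(h, k, W_down, b_down, W_up, b_up)
assert out.shape == (5, 8)  # same shape as the encoder-layer output
```

Because the adapter returns a tensor with the same shape as the encoder-layer output, it can be inserted between any two BERT encoder layers without altering the rest of the network, which is presumably what allows knowledge enhancement "between the encoding layers" as the abstract describes.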
Key words
short text classification / deep learning / bidirectional encoder representations from transformers / knowledge graph / domain knowledge / knowledge adapter / knowledge enhancement
Funding
Young Scientists Fund of the National Natural Science Foundation of China (61906051)
Guangxi Science and Technology Program Project (2020GXNSFAA297255)
Publication year
2024