

Low time complexity short text classification based on fusion of BERT and broad learning
To address the low efficiency and low accuracy of short text classification (STC) tasks, a high-efficiency, high-precision text classification model was proposed that combined bidirectional encoder representations from transformers with a broad learning classifier (BERT-BL). The bidirectional encoder representations from transformers (BERT) model was fine-tuned to update its parameters. The fine-tuned BERT then mapped each short text to its corresponding word vector matrix, which was fed into the broad learning (BL) classifier to complete the classification. Experimental results showed that the BERT-BL model achieved the best accuracy on all three public datasets, while requiring only a small fraction (on the order of one-tenth to one-hundredth) of the time needed by the baseline models support vector machine (SVM), long short-term memory (LSTM), minimum p-norm broad learning (p-BL) and BERT; moreover, its training did not require a high-performance GPU. Comparative analysis indicated that the BERT-BL model not only performed well on STC tasks, but also saved substantial training time.
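The pipeline the abstract describes (fixed sentence embeddings from a fine-tuned BERT, fed to a broad learning classifier whose output weights are solved in closed form rather than by gradient descent) can be sketched minimally in numpy. This is an illustrative sketch only, not the paper's implementation: random vectors stand in for BERT embeddings so the example is self-contained, and the names (Wf, We, n_feat, n_enh, lam) are placeholders, not symbols from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for sentence embeddings produced by a fine-tuned BERT encoder
# (in the real pipeline each row would be a text's pooled word-vector
# representation); random data keeps the sketch runnable without a GPU.
n_samples, emb_dim, n_classes = 200, 32, 3
X = rng.normal(size=(n_samples, emb_dim))
y = rng.integers(0, n_classes, size=n_samples)
Y = np.eye(n_classes)[y]                      # one-hot label matrix

# Broad learning classifier: random linear feature nodes, nonlinear
# enhancement nodes, then output weights via ridge regression.
n_feat, n_enh, lam = 64, 64, 1e-2
Wf = rng.normal(size=(emb_dim, n_feat))
bf = rng.normal(size=n_feat)
Z = X @ Wf + bf                               # feature nodes
We = rng.normal(size=(n_feat, n_enh))
be = rng.normal(size=n_enh)
H = np.tanh(Z @ We + be)                      # enhancement nodes
A = np.hstack([Z, H])                         # broad expansion layer

# Closed-form output weights: (A^T A + lam*I) W = A^T Y
W = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

pred = (A @ W).argmax(axis=1)
train_acc = (pred == y).mean()
```

Because the only trained parameters are solved by one regularized least-squares system, training avoids backpropagation entirely, which is the source of the time savings the abstract reports.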

short text classification; BERT-BL; BERT; broad learning; high accuracy

陈晓江、杨晓奇、陈广豪、刘伍颖


Information Department, Jieyang Branch of the Open University of Guangdong, Jieyang 522095, Guangdong, China

School of Information Science and Technology, Guangdong University of Foreign Studies, Guangzhou 510006, Guangdong, China

Department of Software Engineering, Software Engineering Institute of Guangzhou, Guangzhou 510990, Guangdong, China

Shandong Provincial Key Laboratory of Language Resources Development and Application, Ludong University, Yantai 264025, Shandong, China

Center for Linguistics and Applied Linguistics, Guangdong University of Foreign Studies, Guangzhou 510420, Guangdong, China



Ministry of Education New Liberal Arts Research and Reform Practice Project (2021060049); Shandong Province Graduate Education and Teaching Reform Research Project (SDYJG21185); Shandong Province Undergraduate Teaching Reform Research Key Project (Z2021323); Ministry of Education Humanities and Social Sciences Research Youth Fund Project (20YJC740062); Shanghai Philosophy and Social Sciences "13th Five-Year" Planning Project (2019BYY028); Ministry of Education Humanities and Social Sciences Research Planning Fund Project (20YJAZH069); Guangzhou Science and Technology Planning Project (202201010061)

2024

Journal of Shandong University (Engineering Science)
Shandong University

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.634
ISSN:1672-3961
Year, Volume (Issue): 2024, 54(4)