Small Sample Multi-level Classification Method Based on EABMD
In view of the high complexity and diversity of small-sample multi-level text data, we propose a text classification model based on ERNIE-Attention BiGRU-Multi-Channel CNN with Dilated Conv (EABMD). The model integrates the linguistic representation capability of ERNIE, a local-attention-enhanced BiGRU, and a multi-channel CNN with dilated convolution. The BiGRU with a local attention mechanism better extracts global textual features, while the multi-channel CNN with dilated convolution captures the local features of the text. The features extracted by these two branches are then concatenated and fed into multiple fully connected layers to compute the classification result. Experimental results show that, on a custom dataset, the model outperforms the second-best model in accuracy, Micro-F1, and Macro-F1 by 5.31%, 1.19%, and 9.1% respectively, with a test-set loss of 0.016 and a time reduction of 0.69 ms. On a public dataset, test-set accuracy still reaches 88.21% with a loss of 0.015. The model thus achieves good results in the field of small-sample multi-level classification.
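The two-branch pipeline described above (global features from an attention-weighted BiGRU, local features from multi-channel dilated convolutions, then concatenation into fully connected layers) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the pretrained ERNIE encoder is replaced by a plain embedding layer, and all layer sizes, kernel sizes, and the dilation rate are assumed values, not the authors' settings.

```python
import torch
import torch.nn as nn

class EABMDSketch(nn.Module):
    """Sketch of the EABMD pipeline from the abstract.

    Assumption: a plain nn.Embedding stands in for the pretrained
    ERNIE encoder; hyperparameters are illustrative defaults.
    """

    def __init__(self, vocab_size=1000, emb_dim=128, hidden=64,
                 n_classes=10, kernel_sizes=(2, 3, 4), n_filters=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)  # stand-in for ERNIE
        # global branch: bidirectional GRU with additive attention pooling
        self.bigru = nn.GRU(emb_dim, hidden, bidirectional=True,
                            batch_first=True)
        self.attn = nn.Linear(2 * hidden, 1)
        # local branch: one dilated 1-D convolution per channel (kernel size)
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, n_filters, k, dilation=2)
             for k in kernel_sizes]
        )
        # fused features pass through multiple fully connected layers
        self.fc = nn.Sequential(
            nn.Linear(2 * hidden + n_filters * len(kernel_sizes), hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, tokens):                      # tokens: (B, T)
        x = self.embed(tokens)                      # (B, T, E)
        h, _ = self.bigru(x)                        # (B, T, 2H)
        w = torch.softmax(self.attn(h), dim=1)      # attention weights (B, T, 1)
        global_feat = (w * h).sum(dim=1)            # (B, 2H)
        c = x.transpose(1, 2)                       # (B, E, T) for Conv1d
        local_feat = torch.cat(
            [conv(c).relu().max(dim=2).values for conv in self.convs],
            dim=1,
        )                                           # (B, n_filters * channels)
        # concatenate the two feature parts, then classify
        return self.fc(torch.cat([global_feat, local_feat], dim=1))

model = EABMDSketch()
logits = model(torch.randint(0, 1000, (4, 20)))    # batch of 4 sequences
print(logits.shape)
```

The key design point mirrored here is that the BiGRU branch pools over the whole sequence (global context) while each dilated convolution sees a widened but still local receptive field, so the concatenated vector combines both views before classification.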

small sample; ERNIE model; BiGRU neural network; multi-channel dilated convolution; deep learning

Shen Yongkang, Zhu Quanyin, Sun Jizhou, Shi Yuquan, Liu Chuhan


Faculty of Computer and Software Engineering, Huaiyin Institute of Technology, Huai'an 223001, Jiangsu, China


2024

Journal of Xi'an University (Natural Science Edition)
Xi'an University

Impact factor: 0.209
ISSN:1008-5564
Year, Volume (Issue): 2024, 27(3)