
Dual-Channel Short Text Intent Recognition Algorithm Combining CNN and BiGRU

In the field of short-text intent recognition, convolutional neural networks (CNN) have garnered considerable attention for their outstanding performance in extracting local information. However, they struggle to capture the global features of short-text corpora, which limits their effectiveness. To address this issue, this study combines the strengths of TextCNN and BiGRU-att to propose a dual-channel short-text intent recognition model that leverages both local and global features to better recognize the intent of short texts and to compensate for the model's inadequacy in capturing overall text features. The AB-CNN-BGRU-att model first uses an ALBERT multi-layer bidirectional Transformer structure to vectorize the input text, then feeds the vectors separately into TextCNN and BiGRU networks to extract local and global features. The two types of features are fused, passed through fully connected layers, and fed into a Softmax function to obtain the intent labels. Experimental results show that on the THUCNews_Title dataset, the proposed AB-CNN-BGRU-att algorithm achieves an accuracy (Acc) of 96.68% and an F1 score of 96.67%, outperforming other commonly used intent recognition models.
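The dual-channel architecture described above can be summarized in a short PyTorch sketch. The use of HuggingFace's AlbertModel for the encoder, as well as the kernel sizes, filter counts, and hidden sizes below, are illustrative assumptions; the abstract does not specify these hyperparameters.

import torch
import torch.nn as nn
from transformers import AlbertModel

class ABCNNBGRUAtt(nn.Module):
    # Minimal sketch: ALBERT encoder -> (TextCNN channel, BiGRU-att channel) -> fusion -> classifier.
    def __init__(self, num_classes, albert_name="albert-base-v2",
                 kernel_sizes=(2, 3, 4), num_filters=128, gru_hidden=128):
        super().__init__()
        self.albert = AlbertModel.from_pretrained(albert_name)      # shared text encoder
        dim = self.albert.config.hidden_size

        # Channel 1: TextCNN over the token vectors (local features)
        self.convs = nn.ModuleList(
            [nn.Conv1d(dim, num_filters, k) for k in kernel_sizes])

        # Channel 2: BiGRU with additive attention (global features)
        self.bigru = nn.GRU(dim, gru_hidden, batch_first=True, bidirectional=True)
        self.att = nn.Linear(2 * gru_hidden, 1)

        # Fusion of both channels followed by a fully connected classifier
        fused_dim = num_filters * len(kernel_sizes) + 2 * gru_hidden
        self.fc = nn.Linear(fused_dim, num_classes)

    def forward(self, input_ids, attention_mask):
        # (batch, seq_len, dim) contextual token vectors from ALBERT
        h = self.albert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state

        # TextCNN channel: convolution over time, then global max pooling
        c = h.transpose(1, 2)                                        # (batch, dim, seq_len)
        local = torch.cat(
            [torch.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1)

        # BiGRU-att channel: attention-weighted sum of the GRU states
        g, _ = self.bigru(h)                                         # (batch, seq_len, 2*gru_hidden)
        scores = self.att(g).masked_fill(attention_mask.unsqueeze(-1) == 0, -1e9)
        weights = torch.softmax(scores, dim=1)
        global_feat = (weights * g).sum(dim=1)

        # Concatenate both channels and map to intent logits
        # (Softmax is applied to these logits at inference time.)
        return self.fc(torch.cat([local, global_feat], dim=1))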

Keywords: intention recognition; ALBERT; BiGRU; dual channel

王超、孙喁喁、徐飞、马媛媛、文雯、汪露


School of Computer Science and Engineering, Xi'an Technological University, Xi'an 710021, China


Funding: Project of the National and Local Joint Engineering Laboratory for New Networks and Detection Control (GSYSJ2018013)

Computer Systems & Applications (计算机系统应用)
Institute of Software, Chinese Academy of Sciences

Indexed in: CSTPCD
Impact factor: 0.449
ISSN: 1003-3254
Year, Volume (Issue): 2024, 33(5)