User Intent Classification Based on ALBERT, Lattice IndyLSTM and Attention
A new user intent classification model is proposed for the task of intent recognition in conversational AI systems. The model combines the ALBERT pre-trained model, a Lattice IndyLSTM (lattice-structured Independently Recurrent Long Short-Term Memory network), and a word-level attention fusion mechanism. By constructing a lattice composed of character embeddings and word embeddings and feeding it into the IndyLSTM network, the model addresses several challenges of traditional intent classification: the inability to exploit character-level and word-level information simultaneously, gradient explosion or vanishing in RNNs, and overfitting in LSTM models. Furthermore, a word-level attention mechanism that raises the encoding contribution of domain-specific vocabulary in user input sentences substantially improves intent recognition accuracy. Experimental results demonstrate that the proposed user intent classification model effectively improves precision, recall, F1-score, and other performance metrics.
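As a rough illustration of two components the abstract highlights — per-unit ("independent") recurrent connections, which replace the full H×H recurrent matrix of a standard LSTM with an element-wise product and thereby keep recurrent gradients easier to control, and attention pooling that up-weights informative tokens — here is a minimal NumPy sketch. All function names, shapes, and random parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def indylstm_step(x, h_prev, c_prev, W, u, b):
    """One IndyLSTM step (illustrative sketch). Unlike a standard LSTM,
    the recurrent weights u are per-unit vectors: each hidden unit reads
    only its own previous state via an element-wise product, with no full
    H x H recurrent matrix -- the property that mitigates exploding and
    vanishing recurrent gradients.
    Shapes: x (D,), h_prev/c_prev (H,), W (4H, D), u (4H,), b (4H,)."""
    H = h_prev.shape[0]
    z = W @ x + u * np.tile(h_prev, 4) + b   # element-wise recurrence
    i = sigmoid(z[0:H])                      # input gate
    f = sigmoid(z[H:2 * H])                  # forget gate
    g = np.tanh(z[2 * H:3 * H])              # candidate cell state
    o = sigmoid(z[3 * H:4 * H])              # output gate
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

def attention_pool(states, query):
    """Word-level attention pooling (illustrative): score each encoded
    position against a query vector, softmax the scores, and return the
    weighted sum of states together with the attention weights."""
    scores = states @ query                  # (T,)
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ states, weights         # pooled (H,), weights (T,)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D, H, T = 8, 6, 5                        # embed dim, hidden dim, length
    W = rng.normal(scale=0.1, size=(4 * H, D))
    u = rng.normal(scale=0.1, size=4 * H)    # per-unit recurrent weights
    b = np.zeros(4 * H)
    h, c = np.zeros(H), np.zeros(H)
    states = []
    for x in rng.normal(size=(T, D)):        # stand-in for lattice embeddings
        h, c = indylstm_step(x, h, c, W, u, b)
        states.append(h)
    pooled, weights = attention_pool(np.stack(states), rng.normal(size=H))
    print(pooled.shape, weights.shape)       # (6,) (5,)
```

In the paper's setting the sequence of inputs would come from the character/word lattice and the query vector would be learned so that domain-specific words receive larger attention weights; here both are random placeholders.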

Keywords: attention mechanism; RNN; intent classification; ALBERT; Lattice IndyLSTM

吕海峰、冀肖榆、庞光垚、涂井先、黄芳香


Guangxi Key Laboratory of Machine Vision and Intelligent Control, Wuzhou University, Wuzhou 543002, China

Guangxi Colleges and Universities Key Laboratory of Industry Software Technology, Wuzhou University, Wuzhou 543002, China


Funding: National Natural Science Foundation of China Regional Science Fund (62262059); Guangxi Natural Science Foundation General Program (2021JJA170178); Wuzhou University Provincial-level College Student Innovation and Entrepreneurship Training Program (S202211354104)

2024

Computer and Digital Engineering (计算机与数字工程)
Publisher: The 709 Research Institute of China Shipbuilding Industry Corporation

Indexed in: CSTPCD
Impact factor: 0.355
ISSN: 1672-9722
Year, Volume (Issue): 2024, 52(3)