
Dialogue Generation Model Integrating Emotional and Commonsense Knowledge

Abstract: With the development of deep learning technology, open-domain dialogue systems, an important branch of human-machine dialogue systems, have also advanced rapidly. However, the responses generated by existing open-domain dialogue models still suffer from poor empathy and low diversity. To address these problems, this paper proposes a dialogue generation model that integrates emotional and commonsense knowledge. First, a commonsense knowledge vector is obtained for each word from an emotion dictionary and a commonsense knowledge graph, and this vector is fed into the encoder together with the word's own embedding vector. A two-stage decoding process then generates the response: the first decoding stage predicts the emotional intensity of the word to be generated and derives the corresponding emotion vector from that intensity; the second decoding stage combines the output of the first stage with the embedding vectors of the already generated words and their corresponding commonsense knowledge vectors as input to predict the next word. Experimental results show that the responses generated by the proposed model are more empathetic and diverse, and that the model achieves improvements over the baseline models on the PPL, BLEU, ACC, and DISTINCT metrics.
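The data flow described in the abstract can be sketched as follows. This is a minimal, illustrative outline only, not the authors' implementation: all function names, dimensions, and the stand-in computations (mean pooling for intensity prediction, scalar scaling for the emotion vector) are hypothetical placeholders for the model's learned components.

```python
from typing import List


def encode(word_embs: List[List[float]],
           knowledge_vecs: List[List[float]]) -> List[List[float]]:
    """Concatenate each word embedding with its commonsense knowledge vector
    before encoding, as described in the abstract."""
    return [we + kv for we, kv in zip(word_embs, knowledge_vecs)]


def stage1_predict_intensity(enc_state: List[float]) -> float:
    """Stage 1: predict an emotional-intensity score for the next word.
    (Stand-in: mean of the encoded state; the paper uses a learned decoder.)"""
    return sum(enc_state) / len(enc_state)


def intensity_to_emotion_vec(intensity: float, dim: int = 4) -> List[float]:
    """Map the predicted intensity to an emotion vector.
    (Stand-in: broadcast the scalar; the paper derives this via the lexicon.)"""
    return [intensity] * dim


def stage2_decode_input(enc_state: List[float],
                        emotion_vec: List[float],
                        prev_emb: List[float],
                        prev_knowledge: List[float]) -> List[float]:
    """Stage 2: combine the stage-1 result with the previously generated
    word's embedding and its knowledge vector to predict the next word."""
    return enc_state + emotion_vec + prev_emb + prev_knowledge


# Toy run with 2-dimensional embeddings and knowledge vectors.
word_embs = [[0.1, 0.2], [0.3, 0.4]]
knowledge_vecs = [[1.0, 0.0], [0.0, 1.0]]
encoded = encode(word_embs, knowledge_vecs)        # each state is 4-dim
intensity = stage1_predict_intensity(encoded[0])
emotion_vec = intensity_to_emotion_vec(intensity)
step_input = stage2_decode_input(encoded[0], emotion_vec, [0.5, 0.5], [1.0, 1.0])
print(len(step_input))  # 4 + 4 + 2 + 2 = 12
```

The sketch shows only how the three information sources (encoder state, emotion vector, generated-word context) are composed at each decoding step; the actual model learns these mappings end to end.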

Keywords: Dialogue model; Emotional dictionary; Commonsense knowledge graph; Two-stage decoding; Emotional intensity

CHENG Jinfeng, JIANG Zongli (程金凤, 蒋宗礼)


Faculty of Information Technology, Beijing University of Technology, Beijing 100124


Journal: Computer Science (计算机科学)
Publisher: Chongqing Southwest Information Co., Ltd. (formerly the Southwest Information Center of the Ministry of Science and Technology)
Peking University Core Journal (北大核心)
Impact factor: 0.944
ISSN: 1002-137X
Year, Volume (Issue): 2025, 52(1)