
Dual-Channel Chinese Text Summarization Generation Based on Graph Attention

Traditional Chinese abstractive summarization methods do not fully account for the different meanings carried by the character-level and word-level features of Chinese text, and therefore tend to misinterpret the information in the source text. This paper proposes a dual-channel Chinese text summarization generation method based on Graph Attention: a dual-channel encoder extracts character-level and word-level features of the text separately, Graph Attention extracts the features of the triplet set corresponding to the text, and the fused features are fed into a decoder with a copy mechanism, improving the model's ability to extract information from the source text. Comparative experiments show that the method performs well on both datasets.
DUAL-CHANNEL TEXT SUMMARIZATION GENERATION METHOD BASED ON GRAPH ATTENTION
The traditional abstractive summarization method fails to fully consider the different meanings of character features and word features in the Chinese environment, making it easy to misinterpret the information of the original text. In this paper, a dual-channel Chinese text summarization generation method based on graph attention is proposed. A dual-channel encoder structure is used to extract the character-level and word-level features of the text respectively. The method builds a triplet structure for each text and extracts the features of the corresponding triplet sets through graph attention. The fused features are fed into a decoder with a copy mechanism for summary generation, which effectively improves the extraction of information from the original text. Comparative experiments show that the method performs better on both datasets.
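The abstract does not give the exact formulation of the Graph Attention component. As a point of reference, a minimal single-head graph attention layer in the style of Veličković et al. (2018), operating over node features of a triplet graph, can be sketched as follows (all shapes and parameter names here are illustrative assumptions, not the paper's own):

```python
import numpy as np

def graph_attention(H, A, W, a):
    """Minimal single-head graph attention (GAT-style) sketch.

    H: (N, F) node features, e.g. embeddings of triplet-graph nodes.
    A: (N, N) adjacency matrix (nonzero = edge; self-loops assumed).
    W: (F, F_out) shared linear projection.
    a: (2 * F_out,) attention parameter vector.
    Returns (N, F_out) aggregated node features.
    """
    Z = H @ W                          # project node features
    N = Z.shape[0]
    # Attention logits e_ij = LeakyReLU(a^T [z_i || z_j])
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            pair = np.concatenate([Z[i], Z[j]])
            e[i, j] = pair @ a
    e = np.where(e > 0, e, 0.2 * e)    # LeakyReLU, slope 0.2
    e = np.where(A > 0, e, -1e9)       # mask out non-neighbors
    # Softmax over each node's neighborhood
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ Z                   # weighted aggregation of neighbors
```

This yields one attention-weighted feature vector per graph node; in the paper's pipeline such graph-level features would then be fused with the character- and word-channel encoder outputs before decoding.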

Text summarization; Attention mechanism; Abstractive summarization

曹渝昆 (Cao Yukun), 徐越 (Xu Yue)


School of Computer Science and Technology, Shanghai University of Electric Power, Shanghai 200090, China


National Natural Science Foundation of China (Grant No. 61802249)

2024

Computer Applications and Software
Shanghai Institute of Computing Technology; Shanghai Computer Software Technology Development Center


Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.615
ISSN:1000-386X
Year, Volume (Issue): 2024, 41(4)