A novel abstractive summarization model based on topic-aware and contrastive learning
Abstract: The majority of abstractive summarization models are built on the Sequence-to-Sequence (Seq2Seq) architecture. These models capture syntactic and contextual information between words, but they tend to overlook global semantic information. Moreover, there is an inconsistency between their training objective and the evaluation metrics. To address these limitations, this paper proposes a novel model named ASTCL. It integrates a neural topic model into the Seq2Seq framework to capture the text's global semantic information and guide summary generation. It also incorporates contrastive learning to mitigate the discrepancy between the objective loss and the evaluation metrics by scoring multiple candidate summaries. Experimental results on the CNN/DM, XSum, and NYT datasets demonstrate that ASTCL outperforms other generic models on the summarization task.
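The abstract does not specify how the neural topic model's output is injected into the Seq2Seq decoder. Below is a minimal, hypothetical sketch of one common fusion pattern: a document-topic distribution (e.g. from a VAE-style topic model over bag-of-words) is projected into the decoder's hidden space and mixed in through a learned gate. The class name `TopicGate` and the gating mechanism are illustrative assumptions, not the paper's confirmed design.

```python
import torch
import torch.nn as nn

class TopicGate(nn.Module):
    """Hypothetical fusion of a document-topic vector into decoder states.

    A neural topic model produces a document-level topic distribution
    `theta`; a learned gate mixes its projection into each decoder hidden
    state so generation is steered by global semantics. ASTCL's exact
    fusion mechanism may differ from this sketch.
    """
    def __init__(self, num_topics: int, hidden_size: int):
        super().__init__()
        self.topic_proj = nn.Linear(num_topics, hidden_size)
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, dec_states: torch.Tensor, theta: torch.Tensor) -> torch.Tensor:
        # dec_states: (batch, seq_len, hidden); theta: (batch, num_topics)
        topic = self.topic_proj(theta).unsqueeze(1).expand_as(dec_states)
        g = torch.sigmoid(self.gate(torch.cat([dec_states, topic], dim=-1)))
        # Per-dimension interpolation between local decoder state and
        # the global topic representation.
        return g * dec_states + (1 - g) * topic

fusion = TopicGate(num_topics=50, hidden_size=768)
states = torch.randn(2, 16, 768)
theta = torch.softmax(torch.randn(2, 50), dim=-1)
print(fusion(states, theta).shape)  # torch.Size([2, 16, 768])
```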
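The contrastive component scores multiple candidate summaries so that training better reflects the evaluation metric. A widely used formulation of this idea (e.g. in BRIO-style training) is a pairwise margin ranking loss over candidates pre-sorted by ROUGE against the reference; the sketch below assumes that setup, and the margin scheme is an illustrative choice rather than the paper's confirmed loss.

```python
import torch
import torch.nn.functional as F

def contrastive_ranking_loss(cand_scores: torch.Tensor, margin: float = 0.01) -> torch.Tensor:
    """Pairwise margin ranking loss over candidate-summary scores.

    `cand_scores` holds model scores (e.g. length-normalized log-likelihoods)
    for candidates already sorted from best to worst by an evaluation metric
    such as ROUGE. The loss pushes the model to score metric-better
    candidates higher, narrowing the objective/metric gap.
    """
    loss = cand_scores.new_zeros(())
    n = cand_scores.size(0)
    for i in range(n):
        for j in range(i + 1, n):
            # Candidate i outranks j by the metric; require a score margin
            # that grows with the rank gap (j - i).
            loss = loss + F.relu(cand_scores[j] - cand_scores[i] + margin * (j - i))
    return loss

# Usage: model scores for 4 candidates, best-to-worst by ROUGE.
scores = torch.tensor([-0.9, -1.1, -1.0, -1.4], requires_grad=True)
print(contrastive_ranking_loss(scores))
```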
Huanling Tang, Ruiquan Li, Wenhao Duan, Quansheng Dou, Mingyu Lu
Shandong Technology and Business University; Co-innovation Center of Shandong Colleges and Universities: Future Intelligent Computing