Automatic Text Summary Generation Based on Transformer Model
This paper addresses automatic text summarization, the task of generating a concise summary that expresses the main meaning of a text. Traditional Seq2Seq models have limited ability to capture and store long-term and global features, so the generated summaries often omit important information. This paper therefore proposes a new abstractive summarization model, RC-Transformer-PGN (RCTP), based on the Transformer. The model first augments the Transformer with an additional encoder based on a bidirectional GRU, which captures sequential context representations and improves the extraction of local information. Second, it introduces a pointer-generator network and a coverage mechanism to alleviate the problems of out-of-vocabulary words and repeated words. Experimental results on the CNN/Daily Mail dataset show that the proposed model is more effective than the baseline models.
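The architecture described above can be summarized in a minimal PyTorch sketch: a BiGRU encoder produces sequential context that a Transformer encoder then refines, and a pointer-generator head mixes a vocabulary distribution with a copy distribution over source tokens. The class name `RCTPSketch`, all layer sizes, and the simplified copy attention are illustrative assumptions, not the paper's exact implementation; the coverage term, which accumulates attention weights across decoding steps to penalize repetition, is only indicated in comments.

```python
import torch
import torch.nn as nn

class RCTPSketch(nn.Module):
    """Hypothetical skeleton of an RC-Transformer-PGN-style model.

    A bidirectional GRU first builds a sequential context representation,
    which a Transformer encoder then refines with global self-attention;
    the decoder output drives a pointer-generator head. Hyperparameters
    are illustrative, not the paper's settings.
    """

    def __init__(self, vocab_size=50000, d_model=512, nhead=8, num_layers=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Additional recurrent encoder: BiGRU for local/sequential features.
        self.bigru = nn.GRU(d_model, d_model // 2, batch_first=True,
                            bidirectional=True)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.vocab_proj = nn.Linear(d_model, vocab_size)
        # p_gen gates between generating from the vocabulary and copying.
        self.p_gen = nn.Linear(d_model, 1)

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)
        src, _ = self.bigru(src)        # sequential context first
        memory = self.transformer(src)  # then global self-attention
        mask = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        dec = self.decoder(self.embed(tgt_ids), memory, tgt_mask=mask)
        vocab_dist = torch.softmax(self.vocab_proj(dec), dim=-1)
        # Simplified copy attention over source positions. A coverage
        # mechanism would sum these weights over decoding steps and add
        # a penalty when the model re-attends to the same tokens.
        attn = torch.softmax(dec @ memory.transpose(1, 2), dim=-1)
        p = torch.sigmoid(self.p_gen(dec))
        # Final distribution: mix generation and copy probabilities,
        # scattering copy mass onto the source token ids.
        final = (p * vocab_dist).scatter_add(
            -1, src_ids.unsqueeze(1).expand(-1, dec.size(1), -1),
            (1 - p) * attn)
        return final

if __name__ == "__main__":
    model = RCTPSketch()
    src = torch.randint(0, 50000, (2, 30))   # batch of source sequences
    tgt = torch.randint(0, 50000, (2, 10))   # shifted target sequences
    print(model(src, tgt).shape)             # torch.Size([2, 10, 50000])
```

Because the copy distribution places probability mass directly on source token ids, such a model can emit words outside its fixed vocabulary, which is how the pointer-generator component mitigates the out-of-vocabulary problem the abstract mentions.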