Overview of Controlled Text Generation Based on Pre-trained Models
Natural language generation (NLG), a branch of artificial intelligence, has seen significant progress in recent years, particularly with the development of pre-trained language models (PLMs). NLG aims to generate coherent and meaningful text from various input sources such as text, images, tables, and knowledge bases. Researchers have enhanced the performance of PLMs through methods such as architectural expansion, fine-tuning, and prompt learning. However, NLG still faces challenges in handling unstructured inputs and in generating text for low-resource languages, especially in settings that lack sufficient training data. This study surveys the latest developments in NLG, its application prospects, and the challenges it faces. By analyzing the existing literature, we propose strategies for improving the performance of PLMs and anticipate future research directions. Our findings indicate that, despite these limitations, NLG has shown strong potential in areas such as content creation, automated news reporting, and conversational systems. We conclude that, as the technology advances, NLG will play an increasingly significant role in natural language processing and related fields of artificial intelligence.
Keywords: artificial intelligence; natural language generation; controlled text generation; pre-trained language models; prompt learning
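As a minimal illustration of one of the control methods named above (prompt learning), the sketch below shows prompt-based steering of a pre-trained model. It is not taken from the survey itself; it assumes the Hugging Face transformers library and the public GPT-2 checkpoint, and the "positive review" prompt is an illustrative attribute choice, not a fixed API.

```python
# A minimal sketch (not from the survey) of prompt-based controlled generation
# with a pre-trained language model, assuming the Hugging Face `transformers`
# library and the publicly available GPT-2 checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# The control signal here is simply a natural-language prompt prefix; the
# attribute "positive" is an illustrative choice, not part of any fixed API.
prompt = "Write a positive review of a coffee shop:"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling parameters further shape the output distribution; nucleus (top-p)
# sampling trades diversity against fidelity to the prompt.
outputs = model.generate(
    **inputs,
    max_new_tokens=60,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Heavier-weight control methods mentioned in the abstract, such as fine-tuning or architectural expansion, would instead update or extend the model's parameters; prompt learning is attractive precisely because it leaves the pre-trained weights untouched.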