Wuhan University Journal of Natural Sciences, 2023, Vol. 28, Issue (6): 474-482. DOI: 10.1051/wujns/2023286474

Improve Code Summarization via Prompt-Tuning CodeT5

LI Huanzhen1

Author Information

  • 1. College of Information Engineering, Jiangxi University of Technology, Nanchang 330022, Jiangxi, China

Abstract

Code comments are crucial in software engineering, aiding program maintenance and code reuse. The process of generating clear and descriptive comments that outline code functionality is called code summarization. Existing code summarization methods are typically trained using transformer-based models. However, these models often possess limited parameters and lack specific training tasks, hindering their ability to capture code semantics effectively. This paper uses a high-capacity pre-trained model, CodeT5, for code summarization. CodeT5 is designed with an encoder-decoder architecture that excels in code summarization tasks. Furthermore, we adopt a novel paradigm, "pre-train, prompt, predict", to unlock the knowledge embedded within CodeT5. We devise a prompt template to convert input code into code prompts and fine-tune CodeT5 with these prompts, a process we term prompt tuning. Our effectiveness experiments demonstrate that prompt tuning CodeT5 with only 40% of the dataset achieves performance comparable to fine-tuning CodeT5 with 100% of the dataset, which makes our approach applicable to few-shot learning scenarios. Additionally, our prompt learning method is not sensitive to the size of the tuning dataset. Our practicality experiments show that the performance of prompt-tuned CodeT5 far surpasses that of transformer-based models trained on code-comment datasets collected from Stack Overflow.
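As a concrete illustration of the prompt-based prediction step described above, the minimal Python sketch below wraps a code snippet in a natural-language prompt and feeds it to a publicly released CodeT5 checkpoint. The prompt wording, the "Salesforce/codet5-base" checkpoint, and the generation settings are illustrative assumptions, not the exact template or tuning configuration used in the paper.

from transformers import RobertaTokenizer, T5ForConditionalGeneration

# Load a public CodeT5 checkpoint (assumption: Hugging Face "Salesforce/codet5-base";
# the paper may use a different CodeT5 variant).
tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

def make_prompt(code: str) -> str:
    # Hypothetical prompt template: wrap raw code in natural language so the
    # input follows the "pre-train, prompt, predict" paradigm.
    return f"Summarize the following code: {code} Summary:"

code_snippet = "def add(a, b):\n    return a + b"
inputs = tokenizer(make_prompt(code_snippet), return_tensors="pt",
                   truncation=True, max_length=512)

# During prompt tuning, the reference comment would serve as the decoding target;
# here we only run generation to show the prediction step.
summary_ids = model.generate(**inputs, max_length=32, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))

For tuning, the same prompt-wrapped inputs would be paired with reference comments and optimized with the standard sequence-to-sequence cross-entropy loss.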

Key words

code summarization/transformer-based model/prompt learning/CodeT5/few-shot learning
