现代计算机 (Modern Computer), 2024, Vol. 30, Issue 24: 16-22. DOI: 10.3969/j.issn.1007-1423.2024.24.003

A Survey of Pre-training Models Based on Transformer

杨斌 (Yang Bin)

Author information

  • 1. School of Computer Science, Jinjiang College, Sichuan University, Meishan 620860, China

Abstract

This paper presents a comprehensive review of Transformer-based pre-training models and explores their practical applications and potential advantages. This technology has attracted wide attention in natural language processing, computer vision, speech processing, and interdisciplinary applications. Using a systematic literature review methodology, the paper conducts an extensive survey of the relevant literature, summarizes the latest progress and open problems of Transformer-based pre-training models, and points out directions and priorities for future research.
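The models surveyed above all build on the Transformer architecture. As illustrative background only (this sketch is not taken from the paper, and the function names and toy list-based matrices are assumptions for the example), the Transformer's core operation, scaled dot-product attention, can be written in a few lines of plain Python:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    # Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V,
    # computed row by row over plain Python lists of lists.
    d_k = len(K[0])
    outputs = []
    for q in Q:
        # Similarity of this query against every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output row is the weight-averaged combination of the value rows.
        row = [sum(w * v[j] for w, v in zip(weights, V))
               for j in range(len(V[0]))]
        outputs.append(row)
    return outputs
```

Because the attention weights come from a softmax, each output row is a convex combination of the value rows, with more weight on values whose keys align with the query.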

Key words

Transformer / pre-training model / natural language processing / speech processing / interdisciplinary application


Published: 2024
Journal: 现代计算机 (Modern Computer), ISSN 1007-1423, impact factor 0.292