
Accelerating BERT inference with GPU-efficient exit prediction

BERT is a representative pre-trained language model that has drawn extensive attention for significant improvements in downstream Natural Language Processing (NLP) tasks. The complex architecture and massive parameters bring BERT competitive performance but also result in slow inference. To speed up BERT inference, FastBERT realizes adaptive inference with an acceptable drop in accuracy, based on knowledge distillation and the early-exit technique. However, many factors may limit the performance of FastBERT, such as a teacher classifier that is not knowledgeable enough, batch size shrinkage, and redundant computation of student classifiers. To overcome these limitations, we propose a new BERT inference method with GPU-Efficient Exit Prediction (GEEP). GEEP leverages a shared exit loss to simplify the training process of FastBERT from two steps into only one step, and makes the teacher classifier more knowledgeable by feeding diverse Transformer outputs to it. In addition, an exit layer prediction technique is proposed that utilizes a GPU hash table to handle the token-level exit-layer distribution and sorts test samples by predicted exit layers. In this way, GEEP avoids batch size shrinkage and redundant computation of student classifiers. Experimental results on twelve public English and Chinese NLP datasets prove the effectiveness of the proposed approach. The source code of GEEP will be released to the public upon paper acceptance.
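The batch-size-shrinkage problem the abstract mentions arises because, in a naive early-exit setup, samples in one batch exit at different layers, leaving ever-smaller batches for deeper layers. Sorting samples by their predicted exit layer sidesteps this. Below is a minimal, hypothetical sketch of that grouping step (the function name and interface are illustrative, not the released GEEP code): samples sharing a predicted exit layer are batched together, so every batch runs to a single depth at full size.

```python
from collections import defaultdict


def batch_by_predicted_exit(sample_ids, predicted_exits, batch_size):
    """Group samples by predicted exit layer into fixed-size batches.

    Illustrative sketch: each returned (layer, ids) pair is a batch whose
    members all stop at the same Transformer layer, so no batch shrinks
    partway through inference.
    """
    groups = defaultdict(list)
    for sid, layer in zip(sample_ids, predicted_exits):
        groups[layer].append(sid)

    batches = []
    for layer in sorted(groups):  # shallow-exit batches first
        ids = groups[layer]
        for i in range(0, len(ids), batch_size):
            batches.append((layer, ids[i:i + batch_size]))
    return batches


# Six samples, predicted to exit at layer 3 or layer 1, batch size 2:
batches = batch_by_predicted_exit([0, 1, 2, 3, 4, 5], [3, 1, 3, 1, 3, 1], 2)
# → [(1, [1, 3]), (1, [5]), (3, [0, 2]), (3, [4])]
```

Each batch can then be fed through the backbone only up to its shared exit layer, which is the source of the saved computation.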

BERT; FastBERT; inference acceleration; model distillation; early exit; text classification

Lei LI, Chengyu WANG, Minghui QIU, Cen CHEN, Ming GAO, Aoying ZHOU


Shanghai Engineering Research Center of Big Data Management,School of Data Science and Engineering,East China Normal University,Shanghai 200062,China

Alibaba Group,Hangzhou 311121,China

KLATASDS-MOE,School of Statistics,East China Normal University,Shanghai 200062,China

Supported by the National Natural Science Foundation of China (U1911203, 61877018, 61977025, 62202170) and by Alibaba Group through the Alibaba Innovation Research Program

2024

Frontiers of Computer Science
Higher Education Press


Indexed in: CSTPCD, EI
Impact factor: 0.303
ISSN: 2095-2228
Year, Volume (Issue): 2024, 18(3)