UNSUPERVISED CONTRASTIVE DISTILLATION MODEL WITH HETEROGENEOUS SETTINGS FOR SENTIMENT CLASSIFICATION
With the rapid development of Internet technology, the number of online reviews keeps growing, so sentiment analysis of these reviews has significant research value. Since the BERT model was proposed, pre-trained language models have become a common approach to sentiment analysis tasks. However, these models have large parameter counts and long inference times. Earlier sentiment analysis methods relied on simple neural network models, which train quickly and are easy to deploy but achieve only mediocre accuracy. Combining the advantages of the two types of methods, this paper proposes an unsupervised contrastive distillation model with heterogeneous settings. Under the same dataset settings and computing resources, our model reduces the parameter count of the BERT model by a factor of 146 and the inference time by a factor of 207. Compared with DistilBERT, the number of distilled parameters is reduced by a factor of 88, the inference time by a factor of 42.3, and accuracy improves by 1.8% (68.3% vs. 70.1%).
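A minimal sketch of the general recipe the abstract describes (distilling a frozen BERT teacher into a small, heterogeneous student via an unsupervised contrastive objective) is shown below. The student architecture, projection dimension, temperature, and InfoNCE-style loss are illustrative assumptions, not the authors' exact design.

```python
# Sketch: unsupervised contrastive distillation from BERT into a small student.
# Assumptions (not from the paper): BiLSTM student, [CLS] teacher embeddings,
# in-batch negatives with an InfoNCE loss at temperature tau.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

class SmallStudent(nn.Module):
    """Lightweight student: embedding + BiLSTM + mean pooling (assumed)."""
    def __init__(self, vocab_size, emb_dim=128, hidden=128, out_dim=768):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, out_dim)  # map into teacher space

    def forward(self, input_ids, attention_mask):
        h, _ = self.rnn(self.emb(input_ids))
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (h * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
        return self.proj(pooled)

def contrastive_distill_loss(student_z, teacher_z, tau=0.05):
    """InfoNCE over a batch: each student embedding should match the teacher
    embedding of the same sentence; other sentences act as negatives."""
    s = F.normalize(student_z, dim=-1)
    t = F.normalize(teacher_z, dim=-1)
    logits = s @ t.T / tau                       # (B, B) similarity matrix
    labels = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, labels)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
teacher = AutoModel.from_pretrained("bert-base-uncased").eval()
student = SmallStudent(vocab_size=tokenizer.vocab_size)
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

texts = ["the movie was wonderful", "terrible service, never again"]
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():                            # teacher stays frozen
    teacher_z = teacher(**batch).last_hidden_state[:, 0]  # [CLS] embeddings
loss = contrastive_distill_loss(
    student(batch["input_ids"], batch["attention_mask"]), teacher_z)
loss.backward(); opt.step()
```

Because the objective needs only raw sentences and frozen teacher embeddings, no sentiment labels are required during distillation, which is what makes the setup unsupervised.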
Keywords: Natural language processing · Sentiment classification · Transfer learning · Knowledge distillation