A study of a text sentiment multi-classification model integrating BERT and contrastive learning
To identify public emotions more accurately, this study investigates the problem of multi-class text sentiment analysis. To address the long-tail problem caused by the imbalanced data distribution in multi-class text sentiment analysis, and to improve sentiment classification performance, a text sentiment multi-classification model integrating BERT and contrastive learning is proposed. First, the BERT model is used to generate text embedding vectors, which are then fed into a projection layer to further capture essential features and reduce dimensionality. By treating sentiment texts of the same category as positive samples and sentiment texts of different categories as negative samples, the model performs contrastive learning between positive and negative samples, and thereby learns deeper feature representations for long-tail sentiment categories. Performance testing is conducted on the SMP2020 Weibo sentiment classification public datasets, and experimental results demonstrate that the model achieves accuracy rates of 85% and 86.4% on the two datasets, respectively, outperforming several traditional models in classification performance.
Keywords: text sentiment classification; BERT model; contrastive learning
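The pipeline described in the abstract (BERT sentence embeddings, a projection head, and contrastive learning over same-class positives and different-class negatives) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' released code: the hidden size (768), projection dimension (128), temperature (0.1), and the specific supervised contrastive loss formulation are all illustrative choices.

```python
# Minimal sketch of the abstract's pipeline: BERT [CLS] vectors ->
# projection head -> supervised contrastive loss, where samples sharing
# a sentiment label are positives and all others are negatives.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProjectionHead(nn.Module):
    """Maps BERT sentence vectors to a lower-dimensional space for contrast."""

    def __init__(self, in_dim: int = 768, out_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, in_dim),
            nn.ReLU(),
            nn.Linear(in_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # L2-normalize so cosine similarity reduces to a dot product.
        return F.normalize(self.net(x), dim=-1)


def supervised_contrastive_loss(z: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """Supervised contrastive loss in the style of Khosla et al. (2020):
    for each anchor, same-label samples are positives, the rest negatives."""
    sim = z @ z.T / temperature                       # pairwise similarities
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    sim.masked_fill_(eye, float('-inf'))              # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)     # avoid divide-by-zero
    # Average log-probability over each anchor's positives.
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)
    loss = -pos_log_prob.sum(dim=1) / pos_counts
    return loss.mean()


if __name__ == "__main__":
    # Stand-in for BERT output: a batch of 8 sentence vectors labeled with
    # 6 sentiment classes, as in the SMP2020 Weibo task.
    torch.manual_seed(0)
    cls_vectors = torch.randn(8, 768)
    labels = torch.tensor([0, 0, 1, 2, 3, 4, 5, 1])
    head = ProjectionHead()
    print(supervised_contrastive_loss(head(cls_vectors), labels))
```

In the full model, `cls_vectors` would come from a pretrained BERT encoder (e.g., Hugging Face's `BertModel`), and the contrastive loss would typically be combined with a standard cross-entropy classification loss during training.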