A Sentiment Classification Method Based on a Fine-Tuned BERT Hybrid Model
At present, most sentiment classification tasks use traditional static word-vector language models to obtain context-related information from text. These methods, however, cannot resolve the polysemy and ambiguity introduced by word segmentation, which lowers the accuracy of sentiment classification. To address these problems, this paper proposes a hybrid model, BBLA (BERT-BiLSTM-Attention), based on multi-feature information fusion with an attention mechanism and a neural network. The idea is to build on the output layer of BERT (a pre-trained language representation model) in the sentiment analysis task: short texts are vectorized, sentiment words are spliced into the word vectors as new part-of-speech features to highlight and capture latent sentiment information, and the position vectors of sentiment words are added, thereby addressing the polysemy of sentiment words and the polarity reversal caused by double negation. A bidirectional LSTM (long short-term memory) network and an Attention mechanism are then added to capture the bidirectional contextual semantic dependencies of the text and to prevent the loss of individual sentiment words. Finally, Softmax is used to obtain the sentiment analysis result. The experimental results show that the accuracy of the proposed hybrid model is significantly improved.
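To make the described pipeline concrete, the following is a minimal sketch of a BERT-BiLSTM-Attention classifier in PyTorch. The class name `BBLA`, the hidden sizes, the use of the Hugging Face `transformers` `BertModel`, and the simple additive attention head are illustrative assumptions; the paper's specific sentiment-word feature splicing and position-vector augmentation are not reproduced here.

```python
# Minimal sketch of a BERT-BiLSTM-Attention classifier, assuming PyTorch and
# Hugging Face transformers. Hyperparameters and the attention formulation are
# illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn
from transformers import BertModel


class BBLA(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", lstm_hidden=128, num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)       # contextual word vectors
        self.bilstm = nn.LSTM(self.bert.config.hidden_size,    # bidirectional context encoder
                              lstm_hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * lstm_hidden, 1)              # token-level attention scores
        self.fc = nn.Linear(2 * lstm_hidden, num_classes)      # classification head

    def forward(self, input_ids, attention_mask):
        # BERT output layer: one contextual vector per token
        hidden = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        # BiLSTM captures forward and backward semantic dependencies
        lstm_out, _ = self.bilstm(hidden)
        # Attention weights emphasise sentiment-bearing tokens; padding is masked out
        scores = self.attn(lstm_out).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights.unsqueeze(1), lstm_out).squeeze(1)
        # Softmax over classes gives the sentiment prediction
        return torch.softmax(self.fc(context), dim=-1)
```

In this sketch the attention layer produces a weighted sum of the BiLSTM states, so individual sentiment words can dominate the sentence representation rather than being averaged away, which mirrors the motivation stated in the abstract.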