Aspect-level sentiment analysis incorporating BERT and multiple attention mechanisms
Existing BERT-based aspect-level sentiment analysis models typically use only the output of BERT's final hidden layer, discarding the semantic information carried by its intermediate hidden layers. To address this limitation, we propose a BERT convolution multi-attention (BC-MAT) model that combines BERT with multiple attention mechanisms. The model employs a multi-scale convolutional neural network (MSC) to extract sentiment features from all of BERT's hidden layers. It then applies a content attention mechanism and a hidden-layer attention mechanism to capture the semantic relationships between the words in a sentence and between BERT's hidden layers. Finally, it fuses the two attention mechanisms to calibrate the input sentiment features, compensating for the weaknesses of using a single attention mechanism alone. Experimental results on four datasets drawn from SemEval-2014 Task 4 and ARTS (the Aspect Robustness Test Set) demonstrate that the proposed model improves the accuracy of aspect-level sentiment analysis.
Keywords: natural language processing; aspect-level sentiment analysis; BERT; multi-attention; feature fusion
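To make the described pipeline concrete, the following is a minimal NumPy sketch of the data flow only: random arrays stand in for BERT's hidden layers, average pooling over several window widths stands in for the multi-scale convolution, and simple dot-product scoring stands in for the two attention mechanisms. All names, shapes, and the final sum-fusion are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Schematic stand-in for BERT's hidden layers: (num_layers, seq_len, hidden_dim).
# A real BERT-base model would yield 12 layers of 768-dim states.
num_layers, seq_len, hidden = 12, 8, 16
H = rng.standard_normal((num_layers, seq_len, hidden))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_scale_features(layer, widths=(1, 2, 3)):
    """Stand-in for the multi-scale convolution (MSC): pool each token's
    context over several window widths and concatenate the results."""
    feats = []
    for w in widths:
        pooled = np.stack([layer[max(0, i - w + 1):i + 1].mean(axis=0)
                           for i in range(layer.shape[0])])
        feats.append(pooled)
    return np.concatenate(feats, axis=-1)  # (seq_len, hidden * len(widths))

# Extract multi-scale features from every hidden layer, not just the last one.
F = np.stack([multi_scale_features(H[l]) for l in range(num_layers)])

# Content attention: weight tokens by similarity to an aspect query vector.
aspect_query = rng.standard_normal(F.shape[-1])
content_w = softmax(F[-1] @ aspect_query)      # (seq_len,) weights over tokens
content_repr = content_w @ F[-1]               # aspect-aware sentence vector

# Hidden-layer attention: weight the layers themselves by the same query.
layer_w = softmax(F.mean(axis=1) @ aspect_query)  # (num_layers,) weights
layer_repr = np.einsum('l,lsf->sf', layer_w, F).mean(axis=0)

# Fuse the two attention views (a simple sum here; the fusion rule is assumed).
fused = content_repr + layer_repr
print(fused.shape)  # one calibrated feature vector per sentence
```

A classifier head over `fused` would then predict the sentiment polarity for the given aspect; the sketch only shows how the two attention weightings are formed and combined.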