Abstract
At present, most researchers use a combination of recurrent neural networks and attention mechanisms for aspect-level sentiment classification tasks. However, recurrent neural networks cannot be computed in parallel, and such models encounter problems during training, including truncated backpropagation, vanishing gradients, and exploding gradients. In addition, traditional attention mechanisms may assign low attention weights to important sentiment words in a sentence. To address these problems, an aspect-level sentiment classification model combining a Transformer with an interactive attention network is proposed. In this approach, the pretrained BERT (bidirectional encoder representations from Transformers) model is first used to construct word embedding vectors. Then, Transformer encoders encode the input sentences in parallel. Subsequently, contextual dynamic masking and contextual dynamic weighting mechanisms are applied to focus on local context information that is semantically relevant to specific aspect words. Finally, the model is evaluated on five English datasets and four Chinese review datasets. Experimental results demonstrate that the proposed model outperforms the compared models in terms of both accuracy and F1 score.
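The contextual dynamic masking and weighting mechanisms described above can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: it assumes a semantic-relative-distance threshold `srd`, Transformer hidden states of shape `(seq_len, dim)`, and an aspect span given by token indices; the function names `cdm_mask` and `cdw_weight` are illustrative.

```python
import numpy as np

def relative_distance(seq_len, aspect_start, aspect_end):
    """Distance of each token position to the aspect span (0 inside the span)."""
    idx = np.arange(seq_len)
    return np.where(idx < aspect_start, aspect_start - idx,
                    np.where(idx > aspect_end, idx - aspect_end, 0))

def cdm_mask(hidden, aspect_start, aspect_end, srd=3):
    """Contextual dynamic masking (sketch): zero out hidden states of tokens
    farther than `srd` from the aspect span, keeping only local context."""
    n, _ = hidden.shape
    d = relative_distance(n, aspect_start, aspect_end)
    keep = (d <= srd)[:, None]            # boolean keep-mask per token
    return hidden * keep

def cdw_weight(hidden, aspect_start, aspect_end, srd=3):
    """Contextual dynamic weighting (sketch): linearly down-weight tokens
    beyond the threshold instead of discarding them outright."""
    n, _ = hidden.shape
    d = relative_distance(n, aspect_start, aspect_end).astype(float)
    w = np.where(d <= srd, 1.0, 1.0 - (d - srd) / n)
    return hidden * w[:, None]
```

Masking discards distant context entirely, while weighting retains it with reduced influence; which variant works better typically depends on the dataset.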