In recent years, capsule neural networks (CapsNets) have been successfully applied to text classification owing to their powerful ability in text feature learning. In previous research, all extracted n-gram features play equal roles in text classification, ignoring the fact that the importance of each n-gram feature for a word should be determined by its specific context. This directly affects the model's semantic understanding of the whole input text. To address this, this paper proposes a Partially-connected Routing CapsNet with Multi-scale Feature Attention (MulPart-CapsNets), which incorporates multi-scale feature attention into CapsNets. Multi-scale feature attention automatically selects n-gram features at different scales and accurately captures rich n-gram features for each word via a weighted-sum rule. In addition, the dynamic routing algorithm is improved to reduce the redundant information transferred between child and parent capsules. To verify the effectiveness of the proposed model, experiments are conducted on seven well-known text classification datasets. The experimental results demonstrate that the proposed model consistently improves classification performance, captures richer n-gram features of the text, and possesses a powerful feature learning ability.
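The multi-scale feature attention described above can be illustrated with a minimal sketch: per-word n-gram features are computed at several scales, a softmax over the scales yields attention weights for each word, and the final feature is the weighted sum across scales. This is an assumption-laden illustration, not the paper's implementation: the function names (`ngram_features`, `multiscale_feature_attention`) are hypothetical, windows are simple averages of word embeddings, and the per-scale score is a mean activation where the paper would use learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def ngram_features(emb, n):
    # emb: (seq_len, dim). One feature per word: the mean of the
    # n-gram window starting at that word (zero-padded at the end).
    seq_len, _ = emb.shape
    pad = np.pad(emb, ((0, n - 1), (0, 0)))
    return np.stack([pad[i:i + n].mean(axis=0) for i in range(seq_len)])

def multiscale_feature_attention(emb, scales=(1, 2, 3)):
    # Features for each scale: (num_scales, seq_len, dim)
    feats = np.stack([ngram_features(emb, n) for n in scales])
    # Scalar score per word per scale (illustrative: mean activation;
    # the paper's scoring would be learned end-to-end).
    scores = feats.mean(axis=-1)            # (num_scales, seq_len)
    attn = softmax(scores, axis=0)          # normalize over the scales
    # Weighted sum over scales selects the right n-gram size per word.
    return (attn[..., None] * feats).sum(axis=0)  # (seq_len, dim)
```

The key design point mirrored here is that each word gets its own distribution over n-gram scales, so the model is not forced to treat unigram, bigram, and trigram evidence equally.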
Keywords: capsule neural network; multi-scale feature attention; text classification; routing algorithm; convolutional neural network
Wang Chaofan, Ju Shenggen, Sun Jieping, Chen Run
College of Computer Science, Sichuan University, Chengdu 610065, China
19th Chinese National Conference on Computational Linguistics, Haikou, China