Text Classification Method Combining Attention Gated Neural Network and Stacking Algorithm
To address the low generalization ability and limited classification accuracy of traditional single classifiers, as well as their lack of focus on key information in the text, a text classification model, AGNN-Stacking, is proposed that combines an attention gated neural network (AGNN) with the Stacking algorithm. For feature extraction, the AGNN model adopts a parallel structure of BiLSTM and BiGRU models, each equipped with an attention mechanism, which extracts the key features in the text more richly. For classification, the Stacking algorithm combines the strengths of five different base classifiers to improve text classification accuracy, and offers better generalization ability and stability. Experimental results on the ChineseNLPCorpus dataset show that this classification method effectively improves text classification accuracy compared with machine learning algorithms and single neural network models.
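The abstract does not name the five base classifiers or the meta-learner, so the following is only a minimal toy sketch of the Stacking idea it relies on: base-learner predictions on the training data become the input features of a meta-learner. The threshold learners and the accuracy-weighted vote below are hypothetical stand-ins, not the paper's actual components.

```python
from statistics import mean

def make_threshold_learner(feature_idx):
    """Hypothetical toy base learner: thresholds one feature at the
    midpoint between the two class means."""
    def fit(X, y):
        pos = mean(x[feature_idx] for x, t in zip(X, y) if t == 1)
        neg = mean(x[feature_idx] for x, t in zip(X, y) if t == 0)
        thr = (pos + neg) / 2
        return lambda x: 1 if x[feature_idx] > thr else 0
    return fit

def stacking_fit(X, y, base_fits):
    """Stacking: the base learners' predictions form the meta-level
    features; here the meta-learner is a simple vote weighted by each
    base learner's training accuracy (an assumed, illustrative choice)."""
    models = [fit(X, y) for fit in base_fits]
    meta_features = [[m(x) for m in models] for x in X]   # one row per sample
    accs = [sum(p == t for p, t in zip(col, y)) / len(y)
            for col in zip(*meta_features)]               # per-model accuracy
    def predict(x):
        score = sum(a if m(x) == 1 else -a for a, m in zip(accs, models))
        return 1 if score > 0 else 0
    return predict

# Toy data: class 1 has a high first feature, class 0 a high second feature.
X = [[0.1, 0.9], [0.2, 0.8], [0.8, 0.2], [0.9, 0.1]]
y = [0, 0, 1, 1]
model = stacking_fit(X, y, [make_threshold_learner(0),
                            make_threshold_learner(1)])
```

In this toy run the learner on feature 0 separates the classes while the learner on feature 1 does not, so the accuracy-weighted meta-level suppresses the weak learner's vote, which is the generalization benefit the abstract attributes to Stacking.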
Keywords: single classifier; accuracy; gated neural network; feature extraction; generalization ability