Small-Sample Multi-Level Classification Method Based on EABMD
To address the high complexity and diversity of small-sample multi-level text data, we propose a text classification model based on ERNIE, an Attention-enhanced BiGRU, and a Multi-channel CNN with Dilated convolution (EABMD). The model integrates the language representation capability of ERNIE, a BiGRU enhanced with a local attention mechanism, and a multi-channel CNN with dilated convolution. The BiGRU with local attention is used to extract global textual features, while the multi-channel dilated CNN captures local features of the text. The features extracted by these two branches are then concatenated and fed into multiple fully connected layers to compute the classification result. Experimental results show that, on a custom dataset, our model outperforms the second-best model in accuracy, Micro-F1, and Macro-F1 by 5.31%, 1.19%, and 9.1%, respectively, with a test-set loss of 0.016 and a reduction in inference time of 0.69 ms. On public datasets, the model still achieves a test-set accuracy of 88.21% with a loss of 0.015. The proposed model achieves strong results in small-sample multi-level classification.
Keywords: small sample; ERNIE model; BiGRU neural network; multi-channel dilated convolution; deep learning
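The abstract describes a two-branch architecture: ERNIE token representations feed a BiGRU-with-attention branch for global features and a multi-channel dilated-CNN branch for local features, which are concatenated and classified by fully connected layers. The following is a minimal PyTorch sketch of that pipeline, not the authors' implementation: the hidden sizes, kernel sizes, dilation rates, classification-head layout, and the ERNIE checkpoint name are all assumptions, and the simple soft-attention pooling here stands in for the paper's local attention mechanism.

```python
# Minimal sketch of the EABMD pipeline described in the abstract.
# All hyperparameters and the ERNIE checkpoint name are assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel


class EABMD(nn.Module):
    def __init__(self, num_classes, ernie_name="nghuyong/ernie-1.0-base-zh",
                 gru_hidden=256, conv_channels=128, dilations=(1, 2, 4)):
        super().__init__()
        # ERNIE encoder providing contextual token representations.
        self.ernie = AutoModel.from_pretrained(ernie_name)
        d_model = self.ernie.config.hidden_size

        # Global branch: BiGRU followed by an attention layer that
        # weights each time step (stand-in for the local attention).
        self.bigru = nn.GRU(d_model, gru_hidden, batch_first=True,
                            bidirectional=True)
        self.attn = nn.Linear(2 * gru_hidden, 1)

        # Local branch: multi-channel CNN with one dilated convolution
        # per assumed dilation rate.
        self.convs = nn.ModuleList([
            nn.Conv1d(d_model, conv_channels, kernel_size=3,
                      dilation=d, padding=d)
            for d in dilations
        ])

        # Fully connected head over the concatenated branch features.
        fused_dim = 2 * gru_hidden + conv_channels * len(dilations)
        self.classifier = nn.Sequential(
            nn.Linear(fused_dim, 256), nn.ReLU(), nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, input_ids, attention_mask):
        h = self.ernie(input_ids=input_ids,
                       attention_mask=attention_mask).last_hidden_state

        # Global features: BiGRU + attention-weighted pooling over time.
        g, _ = self.bigru(h)                               # (B, T, 2H)
        w = torch.softmax(self.attn(g), dim=1)             # (B, T, 1)
        global_feat = (w * g).sum(dim=1)                   # (B, 2H)

        # Local features: dilated convolutions + max pooling over time.
        c_in = h.transpose(1, 2)                           # (B, D, T)
        local_feats = [torch.relu(conv(c_in)).max(dim=2).values
                       for conv in self.convs]
        local_feat = torch.cat(local_feats, dim=1)

        # Concatenate both branches and classify.
        fused = torch.cat([global_feat, local_feat], dim=1)
        return self.classifier(fused)
```

The two branches are pooled to fixed-length vectors before concatenation, so texts of different lengths produce a fused feature of constant dimension for the fully connected layers.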