Multi-label Text Classification Based on Relationship Mining and Adversarial Training
Traditional multi-label text classification methods ignore label semantics and do not fully exploit the relationships between text and labels or among the labels themselves. This paper proposes a multi-label text classification model based on relationship mining and adversarial training to address these problems. The BERT model and a Graph Attention Network (GAT) are used to extract the semantic information of the text and to mine the relationships between labels, respectively. First, the text is encoded with BERT to obtain its semantic representation. Then, GAT is applied to mine the relationships between labels and better capture their dependencies. To further mine the relationship between the text and learnable label embeddings, the model employs a multi-head self-attention mechanism. In addition, the R-Drop strategy is used during training to improve the robustness of the model. Experimental results on the AAPD and RCV1 datasets show that the proposed model not only attends to textual information but also effectively captures the dependencies between text and labels as well as the relationships among labels, achieving better performance than several current mainstream multi-label text classification models.
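To make the pipeline concrete, the following is a minimal PyTorch sketch of how the components named in the abstract (BERT encoding, GAT over a label graph, multi-head attention between label embeddings and tokens, and an R-Drop training loss) could be wired together. The layer sizes, the label co-occurrence `edge_index`, the BERT checkpoint, and the R-Drop weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import BertModel
from torch_geometric.nn import GATConv


class RelationMiningClassifier(nn.Module):
    def __init__(self, num_labels, edge_index, hidden=768, heads=4):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")  # assumed checkpoint
        # Learnable label embeddings, refined by GAT over the label graph.
        self.label_emb = nn.Parameter(torch.randn(num_labels, hidden))
        self.register_buffer("edge_index", edge_index)  # label co-occurrence edges (2, E)
        self.gat = GATConv(hidden, hidden, heads=heads, concat=False)
        # Multi-head attention: label embeddings attend to token representations.
        self.attn = nn.MultiheadAttention(hidden, num_heads=heads, batch_first=True)
        self.classifier = nn.Linear(hidden, 1)

    def forward(self, input_ids, attention_mask):
        tokens = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        labels = self.gat(self.label_emb, self.edge_index)            # (L, H)
        labels = labels.unsqueeze(0).expand(tokens.size(0), -1, -1)   # (B, L, H)
        fused, _ = self.attn(labels, tokens, tokens,
                             key_padding_mask=~attention_mask.bool())
        return self.classifier(fused).squeeze(-1)                     # (B, L) logits


def bernoulli_kl(p, q, eps=1e-7):
    """KL divergence between per-label Bernoulli distributions."""
    p, q = p.clamp(eps, 1 - eps), q.clamp(eps, 1 - eps)
    return p * (p / q).log() + (1 - p) * ((1 - p) / (1 - q)).log()


def rdrop_loss(model, input_ids, attention_mask, targets, alpha=1.0):
    """R-Drop: two forward passes with dropout active; BCE on both plus symmetric KL."""
    logits1 = model(input_ids, attention_mask)
    logits2 = model(input_ids, attention_mask)
    bce = (F.binary_cross_entropy_with_logits(logits1, targets) +
           F.binary_cross_entropy_with_logits(logits2, targets))
    p, q = torch.sigmoid(logits1), torch.sigmoid(logits2)
    kl = 0.5 * (bernoulli_kl(p, q) + bernoulli_kl(q, p)).mean()
    return bce + alpha * kl
```

In this sketch the two stochastic forward passes differ only through dropout, and the symmetric KL term regularizes them toward consistent per-label predictions; how the paper balances this term against the classification loss is not specified in the abstract.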