Application of Multi-task Learning in Hate-speech and Individual Characteristics Detection
Multi-task learning is widely used in natural language processing, but multi-task models tend to be sensitive to the relevance between tasks: if task relevance is low or information is transferred inappropriately, task performance may degrade severely. This study proposes a new shared-private multi-task learning model, the BERT-BiLSTM multi-task learning model (BB-MTL), and, drawing on ideas from meta-learning, designs a special parameter optimization method for it, called meta-learning-like train methods (MLL-TM). Furthermore, a new information fusion gate, the Softmax weighted linear gate (SoWLG), is introduced to selectively fuse the shared and private features of each task. To validate the proposed multi-task learning method, and considering that user behavior on the Internet is closely related to individual characteristics, a series of experiments is conducted combining hate-speech detection, personality detection, and emotion detection. The experimental results show that BB-MTL effectively learns feature information across related tasks, reaching accuracies of 81.56%, 77.09%, and 70.82% on the three tasks, respectively.
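The abstract does not give SoWLG's exact formulation, so the following is only a minimal PyTorch sketch of what a softmax-weighted linear gate fusing shared and private features might look like. The class name SoWLGGate, the single linear scorer shared across both feature sources, and the 768-dimensional hidden size (matching BERT-base) are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class SoWLGGate(nn.Module):
        """Hypothetical sketch of a SoWLG-style fusion gate: it scores the
        shared and private feature vectors, turns the scores into softmax
        weights, and returns their weighted sum."""

        def __init__(self, hidden_dim: int):
            super().__init__()
            # One linear scorer applied to both sources; the softmax over
            # the two scores yields the fusion weights (assumed design).
            self.score = nn.Linear(hidden_dim, 1)

        def forward(self, shared: torch.Tensor, private: torch.Tensor) -> torch.Tensor:
            # shared, private: (batch, hidden_dim)
            stacked = torch.stack([shared, private], dim=1)      # (batch, 2, hidden)
            weights = torch.softmax(self.score(stacked), dim=1)  # (batch, 2, 1)
            return (weights * stacked).sum(dim=1)                # (batch, hidden)

    if __name__ == "__main__":
        gate = SoWLGGate(hidden_dim=768)   # 768 matches BERT-base output size
        shared = torch.randn(4, 768)       # e.g., from the shared BERT-BiLSTM branch
        private = torch.randn(4, 768)      # e.g., from a task-private branch
        fused = gate(shared, private)
        print(fused.shape)                 # torch.Size([4, 768])

Under this reading, the gate lets each example lean toward the shared representation when cross-task information helps, and toward the private representation when the task-specific signal dominates; the paper's actual gating function may differ.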