Fine-grained Intent Recognition from Pediatric Medical Dialogues with Contrastive Learning
The foundation of an inquiry dialogue system is natural language understanding (NLU), which extracts intent and entity information from conversational data and transforms it into a structured representation. This process primarily encompasses two tasks: intent recognition and slot filling. Intent recognition, a typical text classification task, aims to discern the underlying purpose of the dialogue, while slot filling uses sequence-labeling algorithms to extract the corresponding slot values from predefined positions in the conversation. Conventional approaches often build separate models for the two tasks, performing slot filling based on the recognized intent; however, this pipeline is susceptible to error propagation. To address this issue, this paper proposes a fine-grained intent recognition method that integrates dialogue intent classification and semantic slot value extraction through contrastive learning. The method combines the intent classification and slot filling tasks, using BART as the backbone model for improvement and innovation. The model employs an encoder-decoder architecture and shares the encoding layer between the intent recognition and slot filling tasks. In addition, it adopts character-level labels in the decoding layer, thereby integrating intent information into the slot filling task, and introduces contrastive learning during sample construction. Experimental results show that the proposed algorithm achieves an intent recognition accuracy of 81.96% and a slot filling F1 score of 85.26% on a medical dialogue dataset, a significant improvement over other algorithms. Ablation experiments on contrastive learning, historical information, and sentence-level intent further substantiate the effectiveness of the proposed method.
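The shared-encoder joint architecture described above can be sketched in PyTorch. This is a minimal illustrative sketch under stated assumptions, not the paper's implementation: the paper uses pretrained BART, whereas a small Transformer stands in here; the class names, dimensions, and the in-batch supervised contrastive loss form are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointIntentSlotModel(nn.Module):
    """Toy encoder-decoder: one shared encoder feeds both an
    utterance-level intent head and a per-token (character-level) slot head."""

    def __init__(self, vocab_size, d_model, n_intents, n_slot_labels):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)  # shared encoder
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
        self.intent_head = nn.Linear(d_model, n_intents)    # sentence-level intent
        self.slot_head = nn.Linear(d_model, n_slot_labels)  # per-character slot labels

    def forward(self, input_ids):
        x = self.embed(input_ids)
        memory = self.encoder(x)                  # shared encoding used by both tasks
        intent_logits = self.intent_head(memory.mean(dim=1))
        dec_out = self.decoder(x, memory)         # decoder attends to shared memory
        slot_logits = self.slot_head(dec_out)     # one label per input character
        return intent_logits, slot_logits

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """In-batch supervised contrastive loss: pulls together utterance
    representations that share an intent label, pushes apart the rest."""
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / temperature
    mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    mask.fill_diagonal_(0)                        # exclude self-pairs as positives
    logits_mask = torch.ones_like(mask).fill_diagonal_(0)
    exp_sim = torch.exp(sim) * logits_mask
    log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
    pos_count = mask.sum(dim=1).clamp(min=1)      # avoid division by zero
    return -(mask * log_prob).sum(dim=1).div(pos_count).mean()
```

In this sketch the intent and slot losses would be summed with the contrastive term during training, so gradients from all three objectives update the single shared encoder.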