Dialogue-level Relation Extraction Based on Attention and Coreference
Dialogue-level relation extraction is characterized by casual language, low information density, and abundant personal pronouns. This paper proposes an end-to-end dialogue relation extraction model built on the TOD-BERT (Task-Oriented Dialogue BERT) pre-trained language model. The model adopts an attention mechanism to capture the interaction between different words and different relations. In addition, coreference information associated with personal pronouns is used to enrich the entity features. Evaluated on DialogRE, a recently released dialogue-level relation extraction dataset, the proposed model achieves an F1 score of 63.77, significantly outperforming the baseline models.
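To make the two ideas in the abstract concrete, the following is a minimal, hypothetical pure-Python sketch (not the paper's actual architecture): dot-product attention that weights token representations by their similarity to a relation query vector, and a pooling step that enriches an entity representation by averaging its coreferent mentions (e.g. a name and the pronouns that refer to it). All function names and vectors here are illustrative assumptions.

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def attend(query, keys, values):
    # Dot-product attention: score each token against the relation
    # query, then return the attention-weighted sum of token values.
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[d] for w, v in zip(weights, values)) for d in range(dim)]

def pool_coreferent_mentions(token_vecs, mention_idxs):
    # Enrich an entity vector by averaging all coreferent mention
    # vectors (the named mention plus its pronoun mentions).
    dim = len(token_vecs[0])
    return [sum(token_vecs[i][d] for i in mention_idxs) / len(mention_idxs)
            for d in range(dim)]
```

In a full model, `attend` would run per relation type over contextual token embeddings, and the pooled entity vectors would feed a relation classifier; here the toy vectors merely illustrate the mechanism.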