GraphFlow+: Exploiting Conversation Flow in Conversational Machine Comprehension with Graph Neural Networks
The conversational machine comprehension (MC) task aims to answer questions about a single passage over the course of a multi-turn conversation. However, recent approaches do not effectively exploit information from the conversation history, so references and ellipses in the current question cannot be resolved. In addition, these methods do not consider the rich semantic relationships between words when reasoning over the passage text. In this paper, we propose a novel model, GraphFlow+, which constructs a context graph for each conversation turn and uses a recurrent graph neural network (GNN) to model the temporal dependencies between the context graphs of successive turns. Specifically, we explore three different ways to construct text graphs: a dynamic graph, a static graph, and a hybrid graph that combines the two. Our experiments on CoQA, QuAC, and DoQA show that GraphFlow+ outperforms state-of-the-art approaches.
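The core idea in the abstract, building a graph per conversation turn and carrying state across turns with a recurrent GNN, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the similarity-based dynamic adjacency, the single message-passing step, and the gated recurrence (and all function names) are assumptions made for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_graph(H):
    """Build a dynamic adjacency matrix from node embeddings H (n x d)
    via row-normalized scaled dot-product similarity (attention-style)."""
    scores = H @ H.T / np.sqrt(H.shape[1])
    return softmax(scores, axis=-1)

def gnn_layer(A, H, W):
    """One message-passing step: aggregate neighbor features with A,
    then apply a learned linear transform and nonlinearity."""
    return np.tanh(A @ H @ W)

def recurrent_gnn(turns, W, U):
    """Process per-turn node embeddings in order, carrying a hidden
    state across turns with a simple sigmoid-gated update (a simplified
    stand-in for a GRU-style recurrence)."""
    state = np.zeros_like(turns[0])
    for H in turns:
        A = dynamic_graph(H)            # graph for this turn
        msg = gnn_layer(A, H, W)        # message passing on this graph
        gate = 1.0 / (1.0 + np.exp(-(H @ U)))  # per-node update gate
        state = gate * msg + (1.0 - gate) * state
    return state
```

A static graph would replace `dynamic_graph` with a fixed adjacency built once from, e.g., syntactic relations; a hybrid variant would combine the two adjacency matrices before message passing.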
Conversational machine comprehension (MC); reading comprehension; question answering; graph neural networks (GNNs); natural language processing (NLP)
Jing Hu, Lingfei Wu, Yu Chen, Po Hu, Mohammed J. Zaki
School of Computer Science, Central China Normal University, Wuhan 430079, China