
GraphFlow+: Exploiting Conversation Flow in Conversational Machine Comprehension with Graph Neural Networks

The conversational machine comprehension (MC) task aims to answer questions about a single passage over multiple turns of conversation. However, recent approaches do not exploit information from the conversation history effectively, so references and ellipses in the current question cannot be resolved. In addition, these methods do not consider the rich semantic relationships between words when reasoning over the passage text. In this paper, we propose a novel model, GraphFlow+, which constructs a context graph for each conversation turn and uses a recurrent graph neural network (GNN) to model the temporal dependencies between the context graphs of successive turns. Specifically, we exploit three different ways to construct text graphs: the dynamic graph, the static graph, and a hybrid graph that combines the two. Our experiments on CoQA, QuAC and DoQA show that GraphFlow+ can outperform state-of-the-art approaches.
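A minimal sketch of the two ideas named in the abstract: building a "dynamic" context graph per conversation turn from word representations via a learned attention-style adjacency, and carrying a recurrent GNN state across turns. All class and function names, shapes, and the specific update rules here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: per-turn dynamic graph construction + recurrent GNN flow.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicGraphGNNFlow(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.attn = nn.Linear(dim, dim, bias=False)   # scores pairwise word affinity
        self.gnn = nn.Linear(dim, dim)                # one-hop message passing
        self.gru = nn.GRUCell(dim, dim)               # carries node states across turns

    def build_dynamic_graph(self, h: torch.Tensor) -> torch.Tensor:
        # h: (num_words, dim) passage representation for the current turn.
        # Row-normalized attention scores act as a soft adjacency matrix.
        scores = self.attn(h) @ h.t()                 # (num_words, num_words)
        return F.softmax(scores, dim=-1)

    def forward(self, turns: list[torch.Tensor]) -> list[torch.Tensor]:
        # turns: per-turn word representations, each of shape (num_words, dim).
        state, outputs = None, []
        for h in turns:
            adj = self.build_dynamic_graph(h)
            msg = torch.relu(self.gnn(adj @ h))       # aggregate neighbor messages
            # Recurrent update links this turn's node states to the previous turn's.
            state = msg if state is None else self.gru(msg, state)
            outputs.append(state)
        return outputs


# Usage: three conversation turns over a 20-word passage with 64-dim features.
model = DynamicGraphGNNFlow(dim=64)
states = model([torch.randn(20, 64) for _ in range(3)])
print([tuple(s.shape) for s in states])  # [(20, 64), (20, 64), (20, 64)]
```

A static graph would instead fix the adjacency from external structure (e.g., syntactic links), and a hybrid graph would combine the two adjacency matrices before message passing.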

Keywords: Conversational machine comprehension (MC); reading comprehension; question answering; graph neural networks (GNNs); natural language processing (NLP)

Jing Hu, Lingfei Wu, Yu Chen, Po Hu, Mohammed J. Zaki


School of Computer Science, Central China Normal University, Wuhan 430079, China

Pinterest, San Francisco 94016, USA

Meta, Mountain View 94039, USA

Rensselaer Polytechnic Institute, Troy 12180, USA


2024

Machine Intelligence Research
Institute of Automation, Chinese Academy of Sciences

Indexed in: CSTPCD, EI
Impact factor: 0.49
ISSN: 2731-538X
Year, volume (issue): 2024, 21(2)