北京大学学报(自然科学版) (Acta Scientiarum Naturalium Universitatis Pekinensis), 2024, Vol. 60, Issue 1: 34-42. DOI: 10.13209/j.0479-8023.2023.074

A Context-Aware Query Suggestion Method Based on Multi-source Data Augmentation through Cross-Attention

张乃洲 (Zhang Naizhou), 曹薇 (Cao Wei)
Author information

  • 1. School of Computer and Information Engineering, Henan University of Economics and Law, Zhengzhou 450046

Abstract

Most existing neural network-based approaches to query suggestion rely solely on the query sequences in query logs as training data. However, these methods cannot fully mine and infer the semantic relationships among words or concepts in query sequences, because the queries themselves inherently lack syntactic relations and sometimes even semantics. To address this problem, this paper proposes a Transformer-based framework built on multi-source data augmentation through cross-attention (MDACA) for generating context-aware query suggestions. The model adopts a Transformer encoder-decoder architecture and uses cross-attention to fuse query-level information with document-level semantics and global query suggestions. Experimental results show that, compared with current methods, the proposed model generates context-aware query suggestions with higher relevance.
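The fusion step described in the abstract rests on standard scaled dot-product cross-attention, where encoder states for the query sequence attend over states from an auxiliary source (document text or global suggestions). The paper's actual architecture is not reproduced here; the following is a minimal NumPy sketch of the cross-attention operation alone, with illustrative tensor names (`q_states`, `doc_states`) that are assumptions, not the authors' identifiers.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_states, kv_states, d_k):
    """Scaled dot-product cross-attention.

    q_states:  (n_q, d)  states from the query-sequence encoder
    kv_states: (n_kv, d) states from an auxiliary source
                         (e.g. document semantics or global suggestions)
    Returns:   (n_q, d)  query states enriched with auxiliary context.
    """
    scores = q_states @ kv_states.T / np.sqrt(d_k)  # (n_q, n_kv)
    weights = softmax(scores, axis=-1)              # rows sum to 1
    return weights @ kv_states

rng = np.random.default_rng(0)
d = 8
q_states = rng.normal(size=(3, d))    # 3 query-token states
doc_states = rng.normal(size=(5, d))  # 5 document-token states
fused = cross_attention(q_states, doc_states, d)
print(fused.shape)  # (3, 8)
```

In a full MDACA-style model, one such cross-attention block per auxiliary source would sit inside the Transformer decoder or encoder stack, so each query representation is a convex combination of auxiliary-source states weighted by relevance.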

Key words

query suggestion / data augmentation / cross-attention / context-aware / Transformer model


Funding

National Natural Science Foundation of China (62072156)

Publication year

2024
北京大学学报(自然科学版) (Acta Scientiarum Naturalium Universitatis Pekinensis)
北京大学 (Peking University)

Indexed in: CSTPCD, CSCD, Peking University Core Journals (北大核心)
Impact factor: 0.785
ISSN: 0479-8023
References: 22