Intent-based Lightweight Self-Attention Network for Sequential Recommendation


Intent-based Lightweight Self-Attention Network for Sequential Recommendation

In existing sequential recommendation models, the self-attention mechanism involves too many parameters, and the preference information carried by users' shopping intentions is not fully exploited. This paper proposes an intent-based lightweight self-attention network for sequential recommendation. On top of the traditional item-sequence encoding, the model introduces an intention-sequence encoding to further mine transition patterns between sequences. Meanwhile, to reduce the computational complexity of pairwise item/intention self-attention within a sequence, a convolutional segmentation sampling module is designed: it divides the user behavior sequence and the intention sequence into multiple segments, mapping user interests onto those segments, and self-attention is then applied to capture the dependencies between segments, effectively reducing the number of parameters. Comparative experiments on three public datasets, MovieLens-1M, Yelp and Amazon-Books, show that compared with baseline models, the hit rate, normalized discounted cumulative gain and mean reciprocal rank improve by 5.32%, 4.40% and 5.51% on MovieLens-1M, by 30.93%, 22.73% and 28.84% on Yelp, and by 7.78%, 11.55% and 13.98% on Amazon-Books, verifying the effectiveness of the proposed model.
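The core complexity argument can be sketched numerically. The following is a minimal NumPy illustration, not the paper's implementation: the convolution weights, kernel size, and stride below are arbitrary placeholders, and a single shared kernel stands in for the learned convolutional segmentation module. It shows how strided convolution over a length-64 sequence yields 16 segment embeddings, so the attention score matrix shrinks from 64×64 to 16×16.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def conv_segments(seq, weights, stride):
    """Strided 1D convolution: each segment is a weighted sum of a
    window of item embeddings, with the kernel shared across dims.
    seq: (L, d); weights: (k,); returns (n_seg, d)."""
    k = len(weights)
    n_seg = (seq.shape[0] - k) // stride + 1
    return np.stack([weights @ seq[i * stride : i * stride + k]
                     for i in range(n_seg)])

def self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the rows of x."""
    q, kk, v = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(q @ kk.T / np.sqrt(q.shape[-1]))
    return scores @ v

L, d, kernel, stride = 64, 16, 4, 4           # placeholder sizes
seq = rng.normal(size=(L, d))                  # 64 item embeddings
w = rng.normal(size=kernel)                    # stand-in conv kernel
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

segs = conv_segments(seq, w, stride)           # 64 items -> 16 segments
out = self_attention(segs, Wq, Wk, Wv)         # attention over segments
print(segs.shape, out.shape)                   # (16, 16) (16, 16)
# Score matrix: 16 x 16 = 256 entries instead of 64 x 64 = 4096.
```

Attending over segments rather than individual items is what makes the quadratic cost of self-attention tractable here: the pairwise score matrix scales with the number of segments, not the sequence length.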

Keywords: sequential recommendation; intent recommendation; convolutional neural network; self-attention mechanism

HE Sida, CHEN Pinghua


School of Computer Science, Guangdong University of Technology, Guangzhou 510006, Guangdong, China


2024

Computer and Modernization
Sponsors: Jiangxi Computer Society; Jiangxi Institute of Computing Technology


CSTPCD
Impact factor: 0.472
ISSN:1006-2475
Year, volume (issue): 2024, (12)