Intent-based Lightweight Self-Attention Network for Sequential Recommendation
Existing sequential recommendation models suffer from the large parameter count of the self-attention mechanism and make insufficient use of the preference information contained in users' shopping intentions. This paper proposes an intent-based lightweight self-attention network for sequential recommendation. Building on the traditional product sequence embedding, the model introduces an intention sequence embedding to further explore the transition patterns between sequences. To reduce the computational complexity of pairwise product/intention self-attention within a sequence, a convolutional segmentation sampling module is designed to divide the user behavior sequence and the intention sequence into multiple segments, mapping user interests onto these segments; the self-attention mechanism is then applied to capture the dependencies between segments, effectively reducing the number of parameters. Comparative experiments are conducted on three public datasets: MovieLens-1M, Yelp, and Amazon-Books. Compared with baseline models, the hit rate, normalized discounted cumulative gain, and mean reciprocal rank improve by 5.32%, 4.40%, and 5.51% on MovieLens-1M, by 30.93%, 22.73%, and 28.84% on Yelp, and by 7.78%, 11.55%, and 13.98% on Amazon-Books, which verifies the effectiveness of the proposed model.
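To make the segmentation idea concrete, the following is a minimal sketch (not the authors' implementation) of how a convolutional segmentation sampling step followed by segment-level self-attention could look in PyTorch; the module name, embedding size, sequence length, and segment count are illustrative assumptions.

```python
import torch
import torch.nn as nn


class ConvSegmentSelfAttention(nn.Module):
    """Illustrative sketch: a strided 1-D convolution compresses an item (or
    intention) embedding sequence into a few segment representations, and
    self-attention is applied between segments rather than between items."""

    def __init__(self, embed_dim=64, seq_len=50, num_segments=5, num_heads=2):
        super().__init__()
        # Strided convolution acts as the segmentation sampling step:
        # each output position summarizes seq_len // num_segments items.
        kernel = seq_len // num_segments
        self.segment_conv = nn.Conv1d(
            in_channels=embed_dim, out_channels=embed_dim,
            kernel_size=kernel, stride=kernel,
        )
        # Attention now runs over num_segments tokens, so its cost scales
        # with num_segments^2 instead of seq_len^2.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) item or intention embeddings
        segments = self.segment_conv(x.transpose(1, 2)).transpose(1, 2)
        out, _ = self.attn(segments, segments, segments)
        return out  # (batch, num_segments, embed_dim)


if __name__ == "__main__":
    model = ConvSegmentSelfAttention()
    items = torch.randn(8, 50, 64)   # toy batch of item embeddings
    print(model(items).shape)        # torch.Size([8, 5, 64])
```

In this sketch the same module would be applied separately to the product sequence and the intention sequence; how the two resulting segment representations are fused is left to the paper's model description.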