A dual-view contrastive learning-guided multi-behavior recommendation method
Multi-behavior recommendation (MBR) typically utilizes various types of user interaction behaviors (such as browsing, adding to cart, and purchasing) to learn user preferences for the target behavior (i.e., purchasing). Due to sparse supervision signals, existing MBR models often suffer from poor recommendation performance. Recently, contrastive learning has achieved success in mining auxiliary supervision signals from the raw data itself. Inspired by this, we propose a dual-view contrastive learning-guided method to enhance MBR. First, we construct two views from the multi-behavior interaction data that capture local and high-order structural information, respectively. Then, we design two different view encoders to learn user and item embeddings on these complementary views. Finally, we apply cross-view collaborative contrastive learning so that the two views mutually supervise each other and yield better embeddings. Experimental results on two real-world datasets demonstrate that our proposed method significantly outperforms baseline methods.
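To illustrate the cross-view contrastive supervision described above, a standard InfoNCE-style formulation (a sketch only; the paper's exact loss may differ) treats the two view embeddings of the same user as a positive pair and those of different users as negatives:

\[
\mathcal{L}_{\mathrm{cl}}^{\mathrm{user}} = \sum_{u \in \mathcal{U}} -\log \frac{\exp\!\big(\mathrm{sim}(\mathbf{z}_u^{(1)}, \mathbf{z}_u^{(2)})/\tau\big)}{\sum_{v \in \mathcal{U}} \exp\!\big(\mathrm{sim}(\mathbf{z}_u^{(1)}, \mathbf{z}_v^{(2)})/\tau\big)}
\]

where \(\mathbf{z}_u^{(1)}\) and \(\mathbf{z}_u^{(2)}\) denote user \(u\)'s embeddings from the two views, \(\mathrm{sim}(\cdot,\cdot)\) is cosine similarity, and \(\tau\) is a temperature hyperparameter (all symbols here are assumptions for illustration). An analogous term over items would typically be added and combined with the target-behavior recommendation loss.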