On the Significance of Graph Neural Networks With Pretrained Transformers in Content-Based Recommender Systems for Academic Article Classification
Full text: NETL | NSTL | Wiley
Recommender systems are tools for interacting with large and complex information spaces: they provide a personalised view of such spaces, prioritising items that are likely to be of interest to the user. They also serve as a significant aid in academic research, helping authors select the most appropriate journals for their articles. This paper presents a comprehensive study of journal recommender systems, focusing on the synergy of graph neural networks (GNNs) with pretrained Transformers for enhanced text classification. We propose a content-based journal recommender system that combines a pretrained Transformer with a Graph Attention Network (GAT), using the title, abstract, and keywords as input data. The proposed architecture enhances text representation by forming graphs from the Transformer's hidden states and attention matrices, excluding padding tokens. Our findings show that this integration improves the accuracy of the journal recommendations and mitigates the Transformer oversmoothing problem, with RoBERTa outperforming BERT models. Moreover, excluding padding tokens from graph construction reduces training time by 8%–15%. Finally, we offer a publicly available dataset comprising 830,978 articles.
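The core idea described above — turning a Transformer's attention matrix into a token graph while dropping padding tokens before the GAT sees them — can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline: the attention threshold, the single-head attention matrix, and the function name are all illustrative assumptions.

```python
import numpy as np

def attention_to_graph(attn, attention_mask, threshold=0.1):
    """Build a token adjacency matrix from one Transformer attention
    matrix, keeping only edges between non-padding tokens whose
    attention weight meets a threshold (names/values are illustrative)."""
    keep = attention_mask.astype(bool)
    # Zero out rows and columns that correspond to padding tokens.
    attn = attn * np.outer(keep, keep)
    adj = (attn >= threshold).astype(int)
    # Drop padding rows/cols entirely, so the GNN never processes them
    # (this pruning is what yields the reported training-time savings).
    idx = np.where(keep)[0]
    return adj[np.ix_(idx, idx)], idx

# Toy example: 4 tokens, the last one is padding.
attn = np.array([
    [0.50, 0.30, 0.20, 0.00],
    [0.20, 0.60, 0.20, 0.00],
    [0.05, 0.15, 0.80, 0.00],
    [0.25, 0.25, 0.25, 0.25],
])
mask = np.array([1, 1, 1, 0])
adj, idx = attention_to_graph(attn, mask)
print(adj.shape)  # (3, 3): the padding token is removed
```

In a full system, `adj` would define the edges and the corresponding hidden states the node features of a GAT layer; here the point is only that masking before graph construction shrinks the graph rather than carrying dead padding nodes through message passing.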