电子学报 (Acta Electronica Sinica), 2024, Vol. 52, Issue 8: 2799-2810. DOI: 10.12263/DZXB.20230515

A Node Classification Method Based on Graph Attention and Improved Transformer

李鑫, 陆伟, 马召祎, 朱攀, 康彬

Author Information

  • 1. School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210003, Jiangsu, China

Abstract

Current graph Transformers mainly attach auxiliary modules to the traditional Transformer framework in order to model graph data. However, these methods do not improve the original Transformer architecture, and their data modeling accuracy needs further enhancement. This paper therefore proposes a node classification method based on graph attention and an improved Transformer. In the proposed framework, a node embedding enhanced with topological features is constructed for graph structure reinforcement learning. A multi-head attention mechanism based on a secondary mask is then designed to aggregate and update node features. Finally, pre-norm and skip connections are introduced to improve the interlayer structure of the Transformer, avoiding the over-smoothing problem caused by the convergence of node features. Experimental results demonstrate that, compared with six baseline models, the proposed method achieves the best evaluation results on all performance indicators, handles node classification on both small-scale and medium-scale datasets, and comprehensively improves classification performance.
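The abstract names three ingredients: attention restricted to graph neighbours (a mask over non-edges), pre-norm, and a skip connection to counter over-smoothing. The sketch below is an illustrative toy, not the authors' implementation: it uses a single head, shares one projection for queries and keys, and invents all names and sizes. It only shows how an adjacency mask, layer normalization placed before attention, and a residual add fit together in one layer.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable row-wise softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    # normalize each node's feature vector (no learned scale/shift, for brevity)
    mu = x.mean(axis=-1, keepdims=True)
    sigma = x.std(axis=-1, keepdims=True)
    return (x - mu) / (sigma + eps)

def masked_attention(x, adj):
    """Single-head self-attention where a node may only attend to
    its graph neighbours: non-edges get a large negative score so
    softmax sends their weight to ~0."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)             # toy: queries and keys share x
    scores = np.where(adj > 0, scores, -1e9)  # adjacency mask
    return softmax(scores, axis=-1) @ x

def pre_norm_layer(x, adj):
    """Pre-norm plus skip connection: x + Attn(LN(x)).
    The residual keeps each node's own features in the output,
    which is the usual remedy against over-smoothing."""
    return x + masked_attention(layer_norm(x), adj)

# toy graph: 4 nodes on a path, self-loops included
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]], dtype=float)
x = np.random.default_rng(0).normal(size=(4, 8))
out = pre_norm_layer(x, adj)
print(out.shape)  # (4, 8): one updated feature vector per node
```

Masking before the softmax (rather than zeroing attention weights afterwards) keeps each row a proper probability distribution over the node's actual neighbours.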

Keywords

node classification / graph attention network / Transformer / secondary mask / interlayer residual / multi-head attention


Funding

National Natural Science Foundation of China (62171232)

Key Research and Development Program of Jiangsu Province (BE2020729)

Publication Year: 2024
Journal: 电子学报 (Acta Electronica Sinica), Chinese Institute of Electronics
Indexed in: CSTPCD, CSCD, Peking University Core (北大核心)
Impact Factor: 1.237
ISSN: 0372-2112