
A Node Classification Method Based on Graph Attention and Improved Transformer
Currently, graph Transformers mainly attach auxiliary modules to the traditional Transformer framework in order to model graph data. However, these methods leave the original Transformer architecture unchanged, and their data modeling accuracy needs further improvement. This paper therefore proposes a node classification method based on graph attention and an improved Transformer. In the proposed framework, a node embedding enhanced by topological features is first constructed for graph structure reinforcement learning. Then, a multi-head attention mechanism based on a secondary mask is developed to aggregate and update node features. Finally, pre-norm and skip connections are introduced to improve the interlayer structure of the Transformer, avoiding the over-smoothing problem caused by the convergence of node features. Experimental results demonstrate that, compared with six typical baseline models, the proposed method achieves the best evaluation results on all performance indicators, handles node classification on both small-scale and medium-scale datasets, and comprehensively improves classification performance.
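As a concrete illustration of two of the ingredients named above, the following is a minimal PyTorch sketch of a pre-norm Transformer layer with skip connections, whose multi-head attention is masked so that each node attends only to its graph neighbors. All class names and hyperparameters are illustrative assumptions, not the paper's implementation; in particular, the single adjacency-based mask below is a simplified stand-in for the secondary mask, whose exact construction is not given in the abstract.

```python
# Hypothetical sketch (not the paper's code): a pre-norm Transformer layer
# whose multi-head attention is restricted by the graph adjacency matrix.
import torch
import torch.nn as nn

class MaskedGraphTransformerLayer(nn.Module):
    def __init__(self, dim: int, num_heads: int = 4, ff_mult: int = 2):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)   # pre-norm placed before attention
        self.norm2 = nn.LayerNorm(dim)   # pre-norm placed before feed-forward
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(dim, ff_mult * dim),
            nn.GELU(),
            nn.Linear(ff_mult * dim, dim),
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (1, N, dim) node features; adj: (N, N) adjacency (1 = edge).
        # In PyTorch, True entries of attn_mask are *blocked* positions,
        # so masking out non-neighbors confines attention to the graph.
        attn_mask = adj == 0
        attn_mask.fill_diagonal_(False)      # always allow self-attention
        h = self.norm1(x)
        h, _ = self.attn(h, h, h, attn_mask=attn_mask)
        x = x + h                            # skip connection around attention
        x = x + self.ff(self.norm2(x))       # skip connection around feed-forward
        return x

# Toy usage: 5 nodes in a ring graph with 16-dimensional features.
N, dim = 5, 16
adj = torch.zeros(N, N)
for i in range(N):
    adj[i, (i + 1) % N] = adj[i, (i - 1) % N] = 1.0
x = torch.randn(1, N, dim)
layer = MaskedGraphTransformerLayer(dim)
print(layer(x, adj).shape)  # torch.Size([1, 5, 16])
```

The pre-norm placement and the two residual (skip) paths keep each layer's output close to its input, which is the standard way such designs counteract the feature-convergence behavior behind over-smoothing.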

node classification; graph attention network; Transformer; secondary mask; interlayer residual; multi-head attention

LI Xin (李鑫), LU Wei (陆伟), MA Zhaoyi (马召祎), ZHU Pan (朱攀), KANG Bin (康彬)


School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210003, Jiangsu, China


Funding: National Natural Science Foundation of China (62171232); Key Research and Development Program of Jiangsu Province (BE2020729)

Acta Electronica Sinica (电子学报), Chinese Institute of Electronics

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 1.237
ISSN: 0372-2112
Year, Volume (Issue): 2024, 52(8)