A Node Classification Method Based on Graph Attention and Improved Transformer
Current graph Transformers mainly attach auxiliary modules to the traditional Transformer framework in order to model graph data. However, these methods leave the original Transformer architecture unchanged, and their data modeling accuracy needs further improvement. This paper therefore proposes a node classification method based on graph attention and an improved Transformer. In the proposed framework, a topology-enhancement-based node embedding is first constructed to reinforce the representation of the graph structure. Then, a secondary-mask-based multi-head attention is developed for feature aggregation and update. Finally, pre-norm and skip connections are introduced to improve the inter-layer structure of the Transformer, which avoids the over-smoothing problem caused by feature convergence. Experimental results demonstrate that, compared with six typical baseline models, our method achieves the best results on all evaluation metrics. Moreover, it can handle node classification on both small and medium-sized datasets and comprehensively improves classification performance.
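To make the inter-layer design concrete, the sketch below shows the general pattern the abstract describes: attention restricted to graph neighbors by a mask, wrapped in a pre-norm residual (skip-connection) layer. This is a minimal NumPy illustration of the standard pre-norm pattern, not the paper's actual implementation; the function names, the single-head simplification, and the weight matrices `W_q`, `W_k`, `W_v` are all assumptions for illustration.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each node's feature vector; in pre-norm, this is applied
    # BEFORE the attention sub-layer rather than after it.
    mu = x.mean(-1, keepdims=True)
    sigma = x.std(-1, keepdims=True)
    return (x - mu) / (sigma + eps)

def masked_attention(x, adj, W_q, W_k, W_v):
    # Scaled dot-product attention restricted by a graph mask:
    # score entries where adj == 0 are pushed to -1e9 so softmax
    # assigns (effectively) zero weight to non-neighbors.
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    scores = q @ k.T / np.sqrt(k.shape[-1])
    scores = np.where(adj > 0, scores, -1e9)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights = weights / weights.sum(-1, keepdims=True)
    return weights @ v

def pre_norm_layer(x, adj, W_q, W_k, W_v):
    # Pre-norm + skip connection: x + Attention(LayerNorm(x)).
    # The residual path keeps each node's own features flowing through
    # the stack, which counteracts over-smoothing from repeated aggregation.
    return x + masked_attention(layer_norm(x), adj, W_q, W_k, W_v)
```

In a post-norm layer the normalization would be applied to the sum instead; the pre-norm ordering keeps an identity path from input to output, which is what allows deeper stacks without feature convergence.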