
Graph neural network method based on graph structure enhancement

In response to the sharp performance drop of graph convolutional networks (GCNs) on graphs with low homophily, a novel graph neural network method based on graph structure enhancement is proposed to learn improved node representations. First, node information is propagated and aggregated to obtain initial node representations. Then, a similarity measure over the node representations is computed to derive the graph's homophilous structure. Finally, the original structure and the homophilous structure of the graph are fused for message passing, yielding the node representations used in downstream tasks. Results show that on six public datasets the proposed algorithm outperforms the comparison algorithms on multiple node-classification metrics. In particular, on the four datasets with low homophily, its accuracy (ACC) exceeds the best baseline by 5.53%, 6.87%, 3.08%, and 4.00%, and its macro-averaged F1 exceeds the best baseline by 5.75%, 8.06%, 6.46%, and 5.61%, respectively. This performance, well above the baselines, indicates that the proposed method successfully improves the structure of the graph data and verifies the algorithm's effectiveness for graph structure optimization.
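The three-step pipeline described in the abstract (propagate to get initial representations, build a homophilous structure from representation similarity, fuse both structures and propagate again) can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the cosine similarity, the top-k neighbor construction, and the fusion weight `alpha` are assumptions introduced for clarity.

```python
import numpy as np

def normalize_adj(A):
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in standard GCNs."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def structure_enhanced_propagation(A, X, k=2, alpha=0.5):
    """Hedged sketch of the three steps from the abstract.

    A: (n, n) binary adjacency matrix of the original graph
    X: (n, f) node feature matrix
    k: number of most-similar neighbors kept per node (assumed construction)
    alpha: fusion weight between original and similarity structure (assumed)
    """
    # Step 1: propagate and aggregate features to get initial representations.
    H = normalize_adj(A) @ X

    # Step 2: cosine similarity between node representations yields a
    # "homophilous" structure connecting each node to its k most similar peers.
    Hn = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)
    S = Hn @ Hn.T
    np.fill_diagonal(S, -np.inf)          # exclude self-similarity
    A_sim = np.zeros_like(A)
    for i in range(A.shape[0]):
        A_sim[i, np.argsort(S[i])[-k:]] = 1.0
    A_sim = np.maximum(A_sim, A_sim.T)    # symmetrize

    # Step 3: fuse the original and homophilous structures, then propagate
    # over the fused graph to get the final node representations.
    A_fused = alpha * A + (1.0 - alpha) * A_sim
    return normalize_adj(A_fused) @ H
```

In a full model each propagation step would also apply a learned weight matrix and nonlinearity; the sketch keeps only the structural part that the abstract emphasizes.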

Keywords: graph structure enhancement; similarity measure; graph convolution network; node classification

张芳、单万锦、王雯


School of Life Sciences, Tiangong University, Tianjin 300387, China

Tianjin Key Laboratory of Optoelectronic Detection Technology and Systems, Tiangong University, Tianjin 300387, China

School of Electronics and Information Engineering, Tiangong University, Tianjin 300387, China


Funding: National Natural Science Foundation of China (61702296)

2024

Journal of Tiangong University
Tiangong University


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.404
ISSN:1671-024X
Year, Volume (Issue): 2024, 43(3)