Computer Applications and Software (计算机应用与软件), 2024, Vol. 41, Issue 10: 177-183. DOI: 10.3969/j.issn.1000-386x.2024.10.027

GRAPH CLASSIFICATION NETWORK BASED ON INDUCTIVE LEARNING GRAPH CONVOLUTION AND SELF-ATTENTION POOLING

倪瑞智¹ 王永平¹ 张晓琳¹ 叶金辉¹ 陶雪晴¹

Author Information

  • 1. School of Information Engineering, Inner Mongolia University of Science and Technology, Baotou, Inner Mongolia 014010, China

Abstract

Graph neural networks often perform poorly at classification on large-scale graphs: they cannot quickly form embeddings for unseen nodes and edges, and they tend to lose important graph features. To address these problems, a graph classification network model combining inductive learning and self-attention pooling is proposed. On the one hand, an inductive learning method with an improved aggregation function is used to quickly embed the node features of the graph; on the other hand, a self-attention pooling method is used to retain the important features of the graph. Finally, a hierarchical framework suited to extracting information from large-scale graphs carries out the downstream graph classification task. Experimental results show that, on the same public datasets, the proposed model improves accuracy by about 2%~10% over other graph classification models.
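The abstract describes a hierarchical pipeline of inductive (GraphSAGE-style) convolutions interleaved with self-attention (SAGPool-style) pooling, followed by a graph-level readout. The following NumPy sketch illustrates that pattern under stated assumptions: the paper's "improved aggregation function" is not specified here, so plain mean aggregation stands in as a placeholder, and all weight names (`W_self`, `W_neigh`, `w_att`) are illustrative, not the authors' implementation.

```python
import numpy as np

def sage_conv(X, A, W_self, W_neigh):
    """Inductive (GraphSAGE-style) graph convolution.

    Each node combines its own features with the mean of its neighbours',
    so an embedding for a previously unseen node can be computed from its
    local neighbourhood alone, without retraining on the whole graph."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)   # avoid divide-by-zero
    neigh_mean = (A @ X) / deg                       # mean aggregation (placeholder)
    return np.maximum(X @ W_self + neigh_mean @ W_neigh, 0.0)  # ReLU

def self_attention_pool(X, A, w_att, ratio=0.5):
    """SAGPool-style pooling: a one-layer graph convolution scores each node,
    and only the top-k nodes (plus the induced subgraph) are kept, preserving
    the most informative parts of the graph."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    scores = np.tanh(((A @ X) / deg) @ w_att).ravel()
    k = max(1, int(np.ceil(ratio * X.shape[0])))
    idx = np.argsort(-scores)[:k]
    return X[idx] * scores[idx, None], A[np.ix_(idx, idx)]

# Hierarchical use: conv -> pool, then a graph-level readout for the classifier.
rng = np.random.default_rng(0)
n, d = 8, 4
X = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T                                          # undirected, no self-loops

W_self, W_neigh = rng.normal(size=(d, d)), rng.normal(size=(d, d))
w_att = rng.normal(size=(d, 1))

H, Ap = self_attention_pool(sage_conv(X, A, W_self, W_neigh), A, w_att)
readout = np.concatenate([H.mean(axis=0), H.max(axis=0)])  # graph-level vector
print(H.shape, Ap.shape, readout.shape)  # (4, 4) (4, 4) (8,)
```

In a full hierarchical model, the conv/pool pair would be stacked several times, with the readouts from each level summed or concatenated before the final classifier.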

Key words

Graph neural network; Graph classification; Self-attention pooling; Graph convolutional network

Funding

National Natural Science Foundation of China (61562065)

Natural Science Foundation of Inner Mongolia Autonomous Region (2019MS06001)

Publication Year

2024
Computer Applications and Software (计算机应用与软件)
Sponsors: Shanghai Institute of Computing Technology; Shanghai Development Center of Computer Software Technology
Indexing: CSTPCD; Peking University Core Journals
Impact factor: 0.615
ISSN: 1000-386X