
NHCL: A Hypergraph Contrastive Learning Based on Native Structure Augmentation

Hypergraph contrastive learning based on self-supervised learning has been studied extensively. However, most existing hypergraph contrastive learning methods rely on data augmentation techniques borrowed from traditional graph representation learning; they pay little attention to the native structure of hypergraphs and fail to fully exploit the higher-order relationships within them. To address this limitation, a series of data augmentation operations based on the native structure of the hypergraph is proposed, namely perturbations of the hyperedges and nodes of the hypergraph. By studying the inclusion, combination, and intersection relationships between hyperedges, as well as the interaction relationships between nodes, a set of basic perturbation operations for hyperedges and nodes is proposed, and these basic operations are further combined to help the model learn. The basic augmentation operations and their combinations are used to generate positive and negative sample pairs for the hypergraph contrastive learning model; a hypergraph neural network learns and encodes their representations, and a contrastive loss guides model training, helping the model capture the higher-order relationships in the hypergraph. To verify the effectiveness of the proposed method, node classification experiments are conducted on 12 commonly used hypergraph benchmark datasets, including Cora-CA, PubMed, and ModelNet40. The experimental results show that, compared with two existing hypergraph self-supervised methods, Self and Con, and with the hypergraph contrastive learning methods HyperGCL and TriCL, the proposed method improves node classification accuracy by 2% to 7%.
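The abstract describes two ingredients: native-structure augmentations that perturb hyperedges and nodes, and a contrastive objective over the resulting views. The sketch below only illustrates that general pipeline shape under simplified assumptions (a hypergraph stored as a list of node-id sets, an InfoNCE-style loss, and random placeholder embeddings standing in for a hypergraph neural network encoder); the function names and NumPy-based setup are illustrative assumptions, not the paper's NHCL implementation.

```python
# Minimal illustrative sketch (NOT the authors' NHCL code): native-structure
# style augmentations on a hypergraph plus an InfoNCE-style contrastive loss.
import random
import numpy as np

def perturb_hyperedges(hyperedges, drop_prob=0.2, merge_prob=0.2, seed=None):
    """Hyperedge-level perturbation: randomly drop hyperedges and merge
    intersecting ones (a stand-in for the inclusion/combination/intersection
    based operations described in the abstract)."""
    rng = random.Random(seed)
    edges = [set(e) for e in hyperedges if rng.random() > drop_prob]
    merged = []
    while edges:
        e = edges.pop()
        for i, f in enumerate(edges):
            if e & f and rng.random() < merge_prob:  # share nodes -> combine
                e = e | f
                edges.pop(i)
                break
        merged.append(e)
    return merged

def perturb_nodes(hyperedges, num_nodes, drop_prob=0.1, seed=None):
    """Node-level perturbation: drop a random node subset from all hyperedges
    to perturb node-node interaction patterns."""
    rng = random.Random(seed)
    dropped = {v for v in range(num_nodes) if rng.random() < drop_prob}
    return [e - dropped for e in hyperedges if len(e - dropped) >= 2]

def info_nce(z1, z2, tau=0.5):
    """InfoNCE-style loss between node embeddings of two views (shape [n, d]);
    matching rows are positives, all other pairs act as negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # pairwise similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives on the diagonal

# Toy usage: two augmented views of a 6-node hypergraph and placeholder
# embeddings in place of HGNN encoder outputs.
H = [{0, 1, 2}, {1, 2, 3}, {3, 4, 5}]
view1 = perturb_hyperedges(H, seed=0)
view2 = perturb_nodes(H, num_nodes=6, seed=1)
z1, z2 = np.random.rand(6, 16), np.random.rand(6, 16)
print(view1, view2, info_nce(z1, z2))
```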

hypergraph contrastive learning; data augmentation; hypergraph native structure; hypergraph neural networks; self-supervised learning

Liu Yu, Hou Along, Fang Shuyan, Gao Feng, Zhang Xiaolong


School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430072, Hubei, China

Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan 430072, Hubei, China


Funding: Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2020AAA0108501); National Natural Science Foundation of China (62261023)

2024

Computer Technology and Development
Shaanxi Computer Society


CSTPCD
Impact factor: 0.621
ISSN: 1673-629X
Year, Volume (Issue): 2024, 34(9)