NHCL: Hypergraph Contrastive Learning Based on Native Structure Augmentation
Hypergraph contrastive learning based on self-supervised learning has been studied extensively. However, current hypergraph contrastive learning mostly relies on the traditional data augmentation methods used in graph representation learning, which pay little attention to the native structure of the hypergraph and fail to fully exploit the higher-order relationships within it. To address this limitation, we propose a series of data augmentation operations based on the native structure of the hypergraph, namely perturbations of its hyperedges and nodes. By studying the inclusion, combination, and intersection relationships between hyperedges and the interaction relationships between nodes, we design a set of fundamental perturbation operations tailored to hyperedges and nodes, and combine these basic operations to help the model learn. Using these basic augmentation operations and their combinations, positive and negative sample pairs are generated for hypergraph contrastive learning. We employ a hypergraph neural network to encode the augmented views and learn their representations while guiding model training with a contrastive loss function, which helps the model better capture high-order relationships within hypergraphs. To validate the effectiveness of the proposed method, we conduct node classification experiments on 12 commonly used hypergraph benchmark datasets, including Cora-CA, PubMed, and ModelNet40. The experimental results show that the proposed method outperforms the two existing hypergraph self-supervised methods Self and Con, as well as hypergraph contrastive learning methods such as HyperGCL and TriCL, achieving a 2% to 7% improvement in node classification accuracy.
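To make the idea of native-structure augmentation concrete, the following is a minimal sketch, not the paper's implementation: a hypergraph is represented as a list of hyperedges (each a set of node ids), and two illustrative perturbations, dropping a node from a hyperedge and merging two hyperedges by union (one form of the combination relation), are composed to produce an augmented view. Two differently seeded views of the same hypergraph can then serve as a positive pair. All function names and parameters here are hypothetical.

```python
import random

def drop_node(hyperedges, rng):
    """Remove one random node from one random hyperedge, keeping edges non-empty."""
    edges = [set(e) for e in hyperedges]
    candidates = [i for i, e in enumerate(edges) if len(e) > 1]
    if candidates:
        i = rng.choice(candidates)
        edges[i].discard(rng.choice(sorted(edges[i])))
    return edges

def merge_edges(hyperedges, rng):
    """Replace two random hyperedges with their union (a 'combination' perturbation)."""
    edges = [set(e) for e in hyperedges]
    if len(edges) >= 2:
        i, j = rng.sample(range(len(edges)), 2)
        merged = edges[i] | edges[j]
        edges = [e for k, e in enumerate(edges) if k not in (i, j)] + [merged]
    return edges

def augment(hyperedges, seed=0):
    """Compose the basic perturbations into one augmented view."""
    rng = random.Random(seed)
    return merge_edges(drop_node(hyperedges, rng), rng)

# Two differently seeded views of the same hypergraph form a positive pair.
H = [{0, 1, 2}, {2, 3}, {3, 4, 5}]
view_a = augment(H, seed=1)
view_b = augment(H, seed=2)
```

In a full pipeline, `view_a` and `view_b` would each be encoded by a hypergraph neural network and pulled together by a contrastive loss, with views of other hypergraphs (or other nodes) serving as negatives.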