Graph Contrastive Learning Research Progress
Graph Contrastive Learning (GCL) extracts information from unlabeled data itself as a self-supervised signal to guide model training, helping to alleviate the dependence of Graph Neural Networks (GNNs) on labeled data and to mitigate structural unfairness; it has become a research hotspot in the GNN field. This paper examines existing GCL research from three aspects: data augmentation methods, sample pair construction, and contrastive learning granularity, and analyzes the respective strengths and weaknesses of existing GCL methods. On this basis, it points out open problems in current GCL research and proposes future research directions, including adaptive GCL, context-aware GCL, dynamic GCL, hypergraph contrastive learning, causal-inference GCL, negative-sample-free GCL, and GCL based on large language models.
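To make the core idea concrete, the following is a minimal sketch (not from the paper) of the InfoNCE-style objective that many GCL methods use: node embeddings from two augmented views of the same graph form positive pairs on the diagonal, with all other nodes serving as negatives. The function name and toy data are illustrative only.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE-style contrastive loss between two augmented views.

    z1, z2: (n_nodes, dim) embeddings of the same nodes under two graph
    augmentations; row i of z1 and row i of z2 form the positive pair,
    and all other rows act as negatives.
    """
    # Cosine similarity matrix between the two views, scaled by temperature.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = (z1 @ z2.T) / tau                        # shape (n, n)
    # Softmax cross-entropy with the diagonal as the positive class.
    sim -= sim.max(axis=1, keepdims=True)          # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

# Toy check: two identical views should score a lower loss than two
# unrelated random embeddings of the same nodes.
rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
loss_same = info_nce(z, z)
loss_diff = info_nce(z, rng.normal(size=(8, 16)))
```

In practice the two views come from augmentations such as edge dropping or feature masking, and the embeddings from a shared GNN encoder; the loss above is what ties the survey's three aspects (augmentation, pair construction, granularity) together.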

graph contrastive learning; research progress; data augmentation; sample pairs; comparison granularity

WU Guodong, WU Zhenchang, WANG Xueni, HU Quanxing, QIN Hui (吴国栋, 吴贞畅, 王雪妮, 胡全兴, 秦辉)


School of Information and Artificial Intelligence, Anhui Agricultural University, Hefei 230036, China


2025

Journal of Chinese Computer Systems (小型微型计算机系统)
Shenyang Institute of Computing Technology, Chinese Academy of Sciences

Indexed in: PKU Core Journals (北大核心)
Impact factor: 0.564
ISSN: 1000-1220
Year, Volume (Issue): 2025, 46(1)