Neural Networks, 2022, Vol. 152. DOI: 10.1016/j.neunet.2022.05.001

Embedding graphs on Grassmann manifold

Zheng, Xuebin¹; Wang, Yu Guang²; Li, Ming³; Gao, Junbin¹; Zhou, Bingxin¹

Author information

  • 1. The University of Sydney
  • 2. Institute of Natural Sciences, Shanghai Jiao Tong University
  • 3. Key Laboratory of Intelligent Education Technology and Application of Zhejiang Province, Zhejiang Normal University

Abstract

Learning efficient graph representations is key to favorably addressing downstream tasks on graphs, such as node or graph property prediction. Given the non-Euclidean structure of graph data, preserving the similarity relationships of the original graph in the embedded space requires specific tools and a suitable similarity metric. This paper develops a new graph representation learning scheme, namely EGG, which embeds approximated second-order graph characteristics into a Grassmann manifold. The proposed strategy leverages graph convolutions to learn hidden representations of the corresponding subspace of the graph, which is then mapped to a Grassmann point of a low-dimensional manifold through truncated singular value decomposition (SVD). The established graph embedding approximates the denoised correlations of node attributes, implemented in the form of a symmetric matrix space that admits Euclidean calculation. The effectiveness of EGG is demonstrated using both clustering and classification tasks at the node level and graph level, where it outperforms baseline models on various benchmarks. (C) 2022 Elsevier Ltd. All rights reserved.
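The pipeline described in the abstract (graph convolution to obtain hidden node representations, truncated SVD to extract a low-dimensional subspace, and a symmetric projection matrix as the Grassmann-point embedding) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the single GCN-style propagation step, and all dimensions below are illustrative assumptions.

    # Minimal sketch of the embedding step outlined in the abstract:
    # graph convolution -> truncated SVD -> Grassmann projection embedding.
    import numpy as np

    def graph_convolution(adj, features, weight):
        """One GCN-style propagation step (stand-in for the learned graph
        convolutions mentioned in the abstract): symmetrically normalise
        A + I, apply a linear transform, then ReLU."""
        a_hat = adj + np.eye(adj.shape[0])
        d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
        a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt
        return np.maximum(a_norm @ features @ weight, 0.0)

    def grassmann_projection_embedding(hidden, k):
        """Map hidden node representations H (n x d) to a point on the
        Grassmann manifold via truncated SVD, then represent that point by
        the symmetric projection matrix U_k U_k^T, so that plain Euclidean
        (Frobenius) arithmetic applies in the embedded space."""
        u, _, _ = np.linalg.svd(hidden, full_matrices=False)
        u_k = u[:, :k]            # orthonormal basis of the leading subspace
        return u_k @ u_k.T        # symmetric n x n projection embedding

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n, d_in, d_hid, k = 12, 8, 16, 3          # toy sizes, not from the paper
        adj = (rng.random((n, n)) < 0.3).astype(float)
        adj = np.triu(adj, 1)
        adj = adj + adj.T                          # symmetric adjacency, no self-loops
        x = rng.standard_normal((n, d_in))
        w = rng.standard_normal((d_in, d_hid))
        h = graph_convolution(adj, x, w)
        p = grassmann_projection_embedding(h, k)
        # Distances between two such embeddings reduce to a Frobenius norm.
        print(p.shape, np.allclose(p, p.T))

Representing the subspace by its projection matrix is what makes the "Euclidean calculation" mentioned in the abstract possible: the embeddings live in the space of symmetric matrices, where standard matrix norms can serve as the similarity metric for clustering or classification.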

Key words

Grassmann manifold; Graph neural network; Projection embedding; Subspace clustering; Geometry


Publication year: 2022

Journal: Neural Networks
Indexed in: EI, SCI
ISSN: 0893-6080
References: 69