Graph Neural Networks (GNNs) have received significant attention for their capability to handle graph data. However, they are difficult to deploy on resource-limited devices because of their model sizes and the scalability constraints imposed by multi-hop data dependency. In addition, real-world graphs usually possess complex structural information and features. Therefore, to improve the applicability of GNNs and fully encode this complicated topological information, Knowledge Distillation on Graphs (KDG) has been introduced to build smaller yet effective models, enabling both model compression and performance improvement. KDG has recently achieved considerable progress, with many studies proposed. In this survey, we systematically review these works. Specifically, we first introduce the challenges and foundations of KDG, then categorize and summarize existing work by answering three questions: (1) what to distill, (2) who distills to whom, and (3) how to distill. We offer in-depth comparisons, elucidating the strengths and weaknesses of each design. Finally, we share our thoughts on future research directions.
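For context, knowledge distillation in this setting typically trains a compact student against the softened predictions of a pretrained teacher GNN. The sketch below illustrates the classical soft-label distillation objective for node classification; it is a generic example rather than any specific method covered by the survey, and the temperature, loss weighting, and toy tensors are illustrative assumptions.

import torch
import torch.nn.functional as F

# Minimal sketch: a frozen teacher GNN's logits guide a smaller student.
# Temperature, weighting, and tensor shapes are illustrative only.
def kd_loss(student_logits, teacher_logits, labels, temperature=2.0, alpha=0.5):
    # Softened teacher distribution and student log-probabilities.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL term pulls the student toward the teacher; CE term fits true labels.
    distill = F.kl_div(log_student, soft_targets, reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1 - alpha) * ce

# Toy usage: 4 nodes, 3 classes.
teacher_logits = torch.randn(4, 3)                      # from the larger, pretrained teacher GNN
student_logits = torch.randn(4, 3, requires_grad=True)  # from the smaller student model
labels = torch.tensor([0, 2, 1, 0])
kd_loss(student_logits, teacher_logits, labels).backward()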
Keywords: knowledge distillation, graph neural networks
Yijun Tian, Shichao Pei, Xiangliang Zhang, Chuxu Zhang, Nitesh V. Chawla
University of Notre Dame, Notre Dame, United States
University of Massachusetts Boston, Boston, United States