Re-quantization based binary graph neural networks

Binary neural networks have become a promising research topic due to their advantages of fast inference speed and low energy consumption. However, most existing studies focus on binary convolutional neural networks, and less attention has been paid to binary graph neural networks. A common drawback of existing binary graph neural networks is that they still retain many inefficient full-precision operations when multiplying three matrices, and are therefore not efficient enough. In this paper, we propose a novel method, called re-quantization-based binary graph neural networks (RQBGN), for binarizing graph neural networks. Specifically, re-quantization, a procedure necessary for further reducing superfluous full-precision operations, quantizes the result of the multiplication between any two matrices during the process of multiplying three matrices. To address the challenges introduced by re-quantization, in RQBGN we first study the impact of different computation orders to find an effective one, and then introduce a mixture of experts to increase the model capacity. Experiments on five benchmark datasets show that performing re-quantization in different computation orders significantly impacts the performance of binary graph neural network models, and that RQBGN outperforms other baselines and achieves state-of-the-art performance.
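To make the re-quantization idea concrete, the following is a minimal illustrative sketch in PyTorch of a binary GNN layer that quantizes the intermediate result of the first matrix multiplication before performing the second one, under both computation orders. The quantizer, the function names, and the dense adjacency matrix are assumptions made purely for illustration and are not taken from the paper; a real implementation would also need a straight-through estimator so that gradients can pass through the sign function during training, which is omitted here.

```python
import torch

def binarize(M):
    # Sign binarization with a single per-matrix scaling factor
    # (an XNOR-Net-style choice made here for illustration only;
    # the paper's actual quantizer may differ).
    alpha = M.abs().mean()
    return alpha * torch.sign(M)

def binary_gnn_layer(A, X, W, order="feature_first"):
    # A: adjacency matrix, X: node features, W: layer weights.
    # A GNN layer computes the triple product A X W. The re-quantization
    # idea sketched here quantizes the intermediate result of the first
    # multiplication, so the second multiplication also operates on a
    # quantized operand instead of a full-precision one.
    Xb, Wb = binarize(X), binarize(W)
    if order == "feature_first":      # compute A (X W)
        H = binarize(Xb @ Wb)         # re-quantize the intermediate result
        return A @ H
    else:                             # compute (A X) W
        H = binarize(A @ Xb)          # re-quantize the intermediate result
        return H @ Wb

# Toy usage with random data (dense adjacency for simplicity).
A = torch.rand(4, 4)
X = torch.randn(4, 8)
W = torch.randn(8, 16)
out = binary_gnn_layer(A, X, W, order="feature_first")
```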

graph neural networks; binary neural networks; mixture of experts; computation-efficient algorithms

Kai-Lang YAO, Wu-Jun LI


National Key Laboratory for Novel Software Technology,Department of Computer Science and Technology,Nanjing University,Nanjing 210046,China

National Key R&D Program of China (2020YFA0713901); National Natural Science Foundation of China (61921006, 62192783); Fundamental Research Funds for the Central Universities (020214380108)

2024

SCIENCE CHINA Information Sciences
Chinese Academy of Sciences

Indexed in: CSTPCD, EI
Impact factor: 0.715
ISSN: 1674-733X
Year, Volume (Issue): 2024, 67(7)