Science China Information Sciences, 2024, Vol. 67, Issue 4: 297-309. DOI: 10.1007/s11432-023-3879-7

Quantum self-attention neural networks for text classification

Guangxi LI¹, Xuanqiang ZHAO², Xin WANG³

Author information

  • 1. Institute for Quantum Computing,Baidu Research,Beijing 100193,China;Centre for Quantum Software and Information,University of Technology Sydney,Sydney NSW 2007,Australia
  • 2. Institute for Quantum Computing,Baidu Research,Beijing 100193,China;QICI Quantum Information and Computation Initiative,Department of Computer Science,The University of Hong Kong,Hong Kong 999077,China
  • 3. Institute for Quantum Computing,Baidu Research,Beijing 100193,China;Thrust of Artificial Intelligence,Information Hub,Hong Kong University of Science and Technology(Guangzhou),Guangzhou 511453,China

Abstract

An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). Although some efforts based on syntactic analysis have opened the door to research in quantum NLP (QNLP), limitations such as heavy syntactic preprocessing and syntax-dependent network architecture make them impracticable on larger and real-world data sets. In this paper, we propose a new simple network architecture, called the quantum self-attention neural network (QSANN), which can compensate for these limitations. Specifically, we introduce the self-attention mechanism into quantum neural networks and then utilize a Gaussian projected quantum self-attention serving as a sensible quantum version of self-attention. As a result, QSANN is effective and scalable on larger data sets and has the desirable property of being implementable on near-term quantum devices. In particular, our QSANN outperforms the best existing QNLP model based on syntactic analysis as well as a simple classical self-attention neural network in numerical experiments of text classification tasks on public data sets. We further show that our method exhibits robustness to low-level quantum noises and showcases resilience to quantum neural network architectures.
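The abstract's "Gaussian projected quantum self-attention" replaces the inner-product scores of classical self-attention with a Gaussian function of query/key quantities measured from parameterized quantum circuits. The classical coefficient form can be sketched as below; this is a minimal illustration only, assuming attention weights proportional to exp(-(q_s - k_j)²) over scalar query/key outputs, with the quantum circuits that would produce `q`, `k`, and `v` omitted and all names illustrative rather than the paper's API:

```python
import numpy as np

def gaussian_self_attention(q, k, v):
    """Mix value vectors with Gaussian-kernel weights: w[s, j] ∝ exp(-(q[s] - k[j])**2)."""
    scores = np.exp(-(q[:, None] - k[None, :]) ** 2)      # (n, n) non-negative scores
    weights = scores / scores.sum(axis=1, keepdims=True)  # each row sums to 1
    return weights @ v                                    # (n, d) weighted sums of values

# Toy example: 3 tokens with scalar query/key outputs and 4-dimensional values.
q = np.array([0.1, 0.5, -0.2])
k = np.array([0.0, 0.4, 0.3])
v = np.arange(12.0).reshape(3, 4)
out = gaussian_self_attention(q, k, v)
```

Because each row of weights is a convex combination, every output row stays within the range spanned by the value vectors, unlike unnormalized kernel scores.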

Key words

quantum neural networks / self-attention / natural language processing / text classification / parameterized quantum circuits


Funding

Guangdong Provincial Quantum Science Strategic Initiative(GDZX2303007)

Quantum Science Center of Guangdong-Hong Kong-Macao Greater Bay Area

Baidu-UTS AI Meets Quantum project

China Scholarship Council (201806070139)

Australian Research Council Project(DP180100691)

Startup Fund from The Hong Kong University of Science and Technology(Guangzhou)(G0101000151)

Innovation Program for Quantum Science and Technology(2021ZD0302901)

Education Bureau of Guangzhou Municipality

Publication year

2024

Science China Information Sciences, Chinese Academy of Sciences
Indexed in: CSTPCD, EI
Impact factor: 0.715
ISSN: 1674-733X
References: 68