Quantum Self-Attention Neural Networks for Text Classification

By Guangxi Li et al.
Published on Sept. 29, 2023

Table of Contents

I. INTRODUCTION
II. METHOD
A. Quantum Self-Attention Layer
B. Loss Function
III. ANALYTICAL GRADIENTS
IV. CONCLUSION

Summary

An emerging direction of quantum computing is to establish meaningful quantum applications in various fields of artificial intelligence, including natural language processing (NLP). This paper proposes a simple new network architecture, the quantum self-attention neural network (QSANN), to address the limitations of current quantum NLP (QNLP) models. QSANN introduces the self-attention mechanism into quantum neural networks and employs a Gaussian projected quantum self-attention, in which attention coefficients are derived from measurement outcomes of parameterized quantum circuits, to improve performance on text classification tasks. Experimental results show that QSANN outperforms existing QNLP models as well as a simple classical self-attention network on text classification benchmarks. The method is robust to low-level quantum noise and resilient to the choice of quantum neural network architecture, suggesting potential quantum advantages.
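To make the Gaussian projected self-attention concrete, below is a minimal NumPy sketch of the coefficient computation. In QSANN the query and key scalars (and the value vectors) are expectation values measured from parameterized quantum circuits; this sketch stands in random numbers for those measurements, and the residual combination, function name, and variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal classical sketch of Gaussian projected self-attention.
# In QSANN, each query q_s and key k_j would be a Pauli expectation
# value measured on a parameterized quantum circuit encoding the
# s-th (or j-th) word; random scalars stand in for them here.

rng = np.random.default_rng(0)

n_words, d_value = 5, 4                       # sentence length, value dimension
q = rng.standard_normal(n_words)              # stand-in query scalars
k = rng.standard_normal(n_words)              # stand-in key scalars
v = rng.standard_normal((n_words, d_value))   # stand-in value vectors
x = rng.standard_normal((n_words, d_value))   # classical input embeddings

def gaussian_self_attention(x, q, k, v):
    """Mix value vectors with Gaussian weights exp(-(q_s - k_j)^2)."""
    # Pairwise Gaussian coefficients between every query and key.
    alpha = np.exp(-np.subtract.outer(q, k) ** 2)   # shape (S, S)
    alpha /= alpha.sum(axis=1, keepdims=True)       # normalize per query
    # Residual combination with the input embeddings (a simplification).
    return x + alpha @ v

y = gaussian_self_attention(x, q, k, v)
print(y.shape)  # (5, 4): one output vector per word
```

The point the sketch captures is that the attention weight between positions s and j decays as exp(-(q_s - k_j)^2), a Gaussian kernel over measured scalars rather than the dot-product softmax of classical self-attention.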