Quantum mixed-state self-attention network

Neural Netw. 2025 Jan 8:185:107123. doi: 10.1016/j.neunet.2025.107123. Online ahead of print.

Abstract

Attention mechanisms have revolutionized natural language processing, and combining them with quantum computing aims to advance the technology further. This paper introduces a novel Quantum Mixed-State Self-Attention Network (QMSAN) for natural language processing tasks. Our model leverages quantum computing principles to enhance the effectiveness of self-attention mechanisms. QMSAN uses a quantum attention mechanism based on mixed states, allowing direct similarity estimation between queries and keys in the quantum domain and leading to more effective attention coefficient calculations. We also propose an innovative quantum positional encoding scheme, implemented through fixed quantum gates within the circuit, which improves the model's ability to capture sequence information without requiring additional qubit resources. In numerical experiments on text classification tasks over public datasets, QMSAN outperforms the Quantum Self-Attention Neural Network (QSANN). Furthermore, we demonstrate QMSAN's robustness in different quantum noise environments, highlighting its potential for near-term quantum devices.
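To make the mixed-state similarity idea concrete, here is a minimal NumPy sketch that scores a query against a key via the trace overlap Tr(ρ_Q ρ_K) between their density matrices. The trace overlap is one standard similarity measure for mixed states; the paper's exact measure, state-preparation circuits, and noise model are assumptions here, not the authors' published construction.

```python
import numpy as np

def pure_state(theta: float) -> np.ndarray:
    """Single-qubit state |psi> = cos(theta/2)|0> + sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def density_matrix(psi: np.ndarray) -> np.ndarray:
    """rho = |psi><psi| for a pure state vector."""
    return np.outer(psi, psi.conj())

def depolarize(rho: np.ndarray, p: float) -> np.ndarray:
    """Depolarizing channel: with probability p, replace rho by I/d,
    turning a pure state into a mixed state."""
    d = rho.shape[0]
    return (1.0 - p) * rho + p * np.eye(d) / d

def similarity(rho_q: np.ndarray, rho_k: np.ndarray) -> float:
    """Trace overlap Tr(rho_q rho_k): a mixed-state similarity score."""
    return float(np.real(np.trace(rho_q @ rho_k)))

# Query and key states from hypothetical data-encoding angles, made
# mixed by depolarizing noise; the score feeds attention coefficients.
rho_q = depolarize(density_matrix(pure_state(0.3)), p=0.1)
rho_k = depolarize(density_matrix(pure_state(0.8)), p=0.1)
print(f"attention score Tr(rho_q rho_k) = {similarity(rho_q, rho_k):.4f}")
```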
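The positional encoding claim can be sketched in the same style: a fixed, parameter-free gate whose angle depends only on token position acts on the already-encoded state, so sequence-order information is injected without extra qubits. The RZ gate and the linear angle schedule below are illustrative assumptions, not the paper's actual gate choice.

```python
import numpy as np

def rz(phi: float) -> np.ndarray:
    """Fixed single-qubit RZ(phi) rotation gate."""
    return np.array([[np.exp(-1j * phi / 2), 0.0],
                     [0.0, np.exp(1j * phi / 2)]])

def positional_encode(rho: np.ndarray, pos: int,
                      omega: float = 0.5) -> np.ndarray:
    """Apply a fixed, position-dependent gate U = RZ(omega * pos) to a
    single-qubit density matrix: rho -> U rho U^dagger. No additional
    qubits are used; the schedule omega * pos is a hypothetical choice."""
    U = rz(omega * pos)
    return U @ rho @ U.conj().T

# Tokens at positions 0 and 3 receive different fixed rotations, so
# otherwise identical embeddings become distinguishable by position.
plus = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)  # |+> state
rho = np.outer(plus, plus.conj())
print(np.round(positional_encode(rho, pos=0), 3))
print(np.round(positional_encode(rho, pos=3), 3))
```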

Keywords: Quantum machine learning; Quantum self-attention mechanism; Self-attention mechanism; Text classification.