Quantum RNNs and LSTMs Through Entangling and Disentangling Power of Unitary Transformations

Abstract
In this paper, we discuss how quantum recurrent neural networks (RNNs) and their enhanced variant, long short-term memory (LSTM) networks, can be modeled using the core ideas presented in Ref. [1], where the entangling and disentangling power of unitary transformations is investigated. In particular, we interpret entangling and disentangling power as the information-retention and forgetting mechanisms of LSTMs; entanglement therefore becomes a key component of the optimization (training) process. We believe that, by leveraging prior knowledge of the entangling power of unitaries, the proposed quantum-classical framework can guide the design of better parameterized quantum circuits for various real-world applications.
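To make the abstract's central analogy concrete, below is a minimal, hypothetical sketch (not the authors' implementation, which is not shown here) using NumPy: it estimates, by Monte Carlo sampling over Haar-random product states, the average entanglement (linear entropy) produced by a parameterized two-qubit ZZ interaction. Under this assumed setup, the gate parameter theta acts as a tunable "retain vs. forget" knob of the kind the abstract associates with LSTM gating; the function names and the choice of gate are illustrative assumptions only.

```python
import numpy as np

def haar_qubit_state(rng):
    """Sample a single-qubit pure state uniformly (Haar) as a normalized 2-vector."""
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    return v / np.linalg.norm(v)

def linear_entropy_of_entanglement(psi):
    """Linear entropy 1 - Tr(rho_A^2) of a two-qubit pure state psi (4-vector)."""
    m = psi.reshape(2, 2)      # amplitudes arranged as a 2x2 matrix
    rho_a = m @ m.conj().T     # reduced density matrix of qubit A
    return 1.0 - np.real(np.trace(rho_a @ rho_a))

def entangling_power(u, n_samples=2000, seed=0):
    """Monte Carlo estimate of the average entanglement that the 4x4 unitary u
    creates when applied to Haar-random two-qubit product states
    (a Zanardi-style notion of entangling power)."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        prod = np.kron(haar_qubit_state(rng), haar_qubit_state(rng))
        total += linear_entropy_of_entanglement(u @ prod)
    return total / n_samples

def zz_interaction(theta):
    """Parameterized two-qubit unitary exp(-i * theta * ZZ / 2); theta controls how
    strongly it entangles (theta = 0 gives the identity)."""
    z = np.diag([1.0, -1.0])
    zz = np.kron(z, z)
    return np.cos(theta / 2) * np.eye(4) - 1j * np.sin(theta / 2) * zz

if __name__ == "__main__":
    for theta in (0.0, np.pi / 8, np.pi / 4, np.pi / 2):
        ep = entangling_power(zz_interaction(theta))
        print(f"theta = {theta:.3f}  ->  estimated entangling power ~ {ep:.4f}")
```

In this toy reading, theta = 0 leaves incoming product states unentangled with the rest of the register, while larger theta couples them more strongly; this is the sense in which prior knowledge of a gate's entangling power could inform how retention and forgetting are parameterized in a quantum LSTM cell.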
@article{daskin2025_2505.06774,
  title   = {Quantum RNNs and LSTMs Through Entangling and Disentangling Power of Unitary Transformations},
  author  = {Ammar Daskin},
  journal = {arXiv preprint arXiv:2505.06774},
  year    = {2025}
}