Quantum Long Short-Term Memory

By Samuel Yen-Chi Chen et al.
Published on Sept. 4, 2020

Table of Contents

Abstract
I. Introduction
II. Classical Machine Learning
III. Variational Quantum Circuits
IV. Quantum LSTM
V. Stack All the Blocks

Summary

The paper proposes QLSTM, a hybrid quantum-classical variant of Long Short-Term Memory (LSTM) that combines quantum and classical machine learning techniques, and demonstrates its effectiveness in learning temporal data along with its potential applications in quantum computing. It introduces Variational Quantum Circuits (VQCs) and explains how they are used within the QLSTM framework: the architecture stacks multiple VQC blocks to process input data and learn sequential dependencies. The paper gives detailed mathematical formulations and explanations of each component of the QLSTM cell. Overall, it explores the feasibility of using quantum machine learning to model temporal data and highlights advantages of QLSTM over classical LSTM models.
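As a rough illustration of the cell structure described above, the sketch below mirrors the standard LSTM gate equations with each gate's transformation computed by a stand-in for a VQC. The class name QLSTMCell, the make_vqc helper, and the use of plain linear layers as placeholders are illustrative assumptions, not the paper's implementation; in the hybrid model each placeholder would be replaced by an actual variational quantum circuit acting on the concatenated input and previous hidden state.

```python
import torch
import torch.nn as nn


class QLSTMCell(nn.Module):
    """Minimal sketch of a QLSTM cell: the LSTM gates are computed by
    variational quantum circuits (VQCs) instead of classical dense layers.
    Here each VQC is stood in for by a small classical linear layer so the
    sketch runs without a quantum simulator; swapping make_vqc for a real
    VQC layer would yield the hybrid quantum-classical cell."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.hidden_size = hidden_size
        concat_size = input_size + hidden_size

        def make_vqc() -> nn.Module:
            # Placeholder for a VQC that maps the concatenated vector
            # [x_t, h_{t-1}] to hidden_size expectation values.
            return nn.Linear(concat_size, hidden_size)

        self.vqc_forget = make_vqc()
        self.vqc_input = make_vqc()
        self.vqc_update = make_vqc()
        self.vqc_output = make_vqc()

    def forward(self, x_t, state):
        h_prev, c_prev = state
        v_t = torch.cat([x_t, h_prev], dim=-1)      # v_t = [x_t, h_{t-1}]

        f_t = torch.sigmoid(self.vqc_forget(v_t))   # forget gate
        i_t = torch.sigmoid(self.vqc_input(v_t))    # input gate
        g_t = torch.tanh(self.vqc_update(v_t))      # candidate cell update
        o_t = torch.sigmoid(self.vqc_output(v_t))   # output gate

        c_t = f_t * c_prev + i_t * g_t               # new cell state
        h_t = o_t * torch.tanh(c_t)                  # new hidden state
        return h_t, c_t


# Usage: unroll the cell over a sequence, one time step at a time.
cell = QLSTMCell(input_size=4, hidden_size=8)
h = torch.zeros(1, 8)
c = torch.zeros(1, 8)
for x_t in torch.randn(10, 1, 4):                   # sequence of length 10
    h, c = cell(x_t, (h, c))
```

Stacking several such cells (or feeding the hidden states of one layer into the next) corresponds to the "Stack All the Blocks" section of the paper, where multiple VQC-based blocks are composed to model longer-range sequential dependencies.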