Analyzing Convergence in Quantum Neural Networks: Deviations from Neural Tangent Kernels

By X. You et al.
Published on March 28, 2023

Table of Contents

1 Introduction
2 Preliminaries
3 Deviations of QNN Dynamics from NTK

Summary

A quantum neural network (QNN) is a parameterized mapping that can be implemented efficiently on near-term Noisy Intermediate-Scale Quantum (NISQ) computers. Contrary to popular belief, the training dynamics of QNNs deviate from the tangent kernel regression derived at random initialization. This work studies the convergence of QNNs, providing theoretical insight into their training dynamics and convergence rates. For over-parameterized QNNs, the authors show a non-negligible deviation from kernel regression dynamics caused by the unitarity of quantum operations: because measurement outputs are bounded expectations, the model cannot behave as an unconstrained linear model. The analysis establishes at most sublinear convergence for QNNs with Pauli measurements, challenging the explanatory power of traditional kernel regression dynamics. The paper also identifies the true asymptotic dynamics of QNN training and highlights the role of the measurement range in QNN convergence. These findings contribute to a deeper understanding of the convergence behavior of quantum neural networks.
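To make the tangent-kernel picture concrete, here is a minimal single-qubit sketch in plain NumPy (not the authors' construction; the circuit layout, learning rate, and target value are illustrative assumptions). It trains a small parameterized circuit with a Pauli-Z measurement by gradient descent and tracks the empirical tangent kernel K(theta) = grad f(theta) . grad f(theta). Because the output is a bounded Pauli expectation, the kernel typically drifts as the parameters move, rather than staying frozen as exact NTK dynamics would require.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z observable

def ry(t):
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(t / 2), np.sin(t / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def rz(t):
    """Single-qubit rotation about the Z axis."""
    return np.diag([np.exp(-1j * t / 2), np.exp(1j * t / 2)])

def qnn_output(theta):
    """f(theta) = <0| U(theta)^dagger Z U(theta) |0>, a Pauli expectation bounded in [-1, 1]."""
    psi = np.array([1.0, 0.0], dtype=complex)
    for k in range(0, len(theta), 2):           # alternating RY-RZ layers
        psi = rz(theta[k + 1]) @ (ry(theta[k]) @ psi)
    return float(np.real(np.conj(psi) @ (Z @ psi)))

def grad_f(theta, eps=1e-6):
    """Central-difference gradient of the QNN output."""
    g = np.zeros(len(theta))
    for i in range(len(theta)):
        e = np.zeros(len(theta)); e[i] = eps
        g[i] = (qnn_output(theta + e) - qnn_output(theta - e)) / (2 * eps)
    return g

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, size=8)     # random initialization
y, lr = 0.95, 0.2                               # illustrative target and step size

g0 = grad_f(theta)
k0 = g0 @ g0                                    # empirical tangent kernel at init
for step in range(500):
    g = grad_f(theta)
    theta -= lr * (qnn_output(theta) - y) * g   # gradient descent on squared loss

g1 = grad_f(theta)
print(f"kernel at init: {k0:.4f}")
print(f"kernel at end:  {g1 @ g1:.4f}  (drifts during training: not a frozen NTK)")
print(f"residual:       {qnn_output(theta) - y:+.4f}")
```

With these illustrative settings the residual typically shrinks while the kernel value changes noticeably between initialization and the end of training, which is the qualitative deviation from fixed-kernel regression that the paper analyzes.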