Abstract: Solving for the ground state and the ground-state properties of quantum many-body systems is generically a hard task for classical algorithms. For a family of Hamiltonians defined on an $m$-dimensional space of physical parameters, the ground state and its properties at an arbitrary parameter configuration can be predicted via a machine learning protocol up to a prescribed prediction error $\varepsilon$, provided that a sample set (of size $N$) of the states can be efficiently prepared and measured. In a recent work [Huang et al., Science 377, eabk3333 (2022)], a rigorous guarantee for such a generalization was proved. Unfortunately, a sample complexity of $N = m^{\mathcal{O}\left(\frac{1}{\varepsilon}\right)}$, exponential in the inverse prediction error, was found to be universal for generic gapped Hamiltonians. That result suits the regime where the dimension of the parameter space is large while the required accuracy is modest, and it does not enter the regime of highly precise learning and prediction. In this work, we consider the complementary scenario, where $m$ is a finite, not necessarily large constant and the scaling with the prediction error becomes the central concern. By exploiting physical constraints and positive good kernels for predicting the density matrix, we rigorously obtain an exponentially improved sample complexity, $N = \mathrm{poly}\left(\varepsilon^{-1}, n, \log \frac{1}{\delta}\right)$, where $\mathrm{poly}$ denotes a polynomial function, $n$ is the number of qubits in the system, and $1-\delta$ is the probability of success. Moreover, if we restrict ourselves to learning ground-state properties under strong locality assumptions, the number of samples can be further reduced to $N = \mathrm{poly}\left(\varepsilon^{-1}, \log \frac{n}{\delta}\right)$. This rigorously proven result represents a significant improvement on, and an indispensable extension of, the existing work.
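The following is a minimal, illustrative sketch of the general idea of kernel-based prediction over a low-dimensional parameter space, not the construction analyzed in the abstract: ground-state property values measured at $N$ sampled parameter points are combined through a positive kernel to predict the value at a new parameter point. The Gaussian kernel (a stand-in for the positive good kernels mentioned above), the Nadaraya-Watson weighting, the synthetic target function, and all numerical choices are assumptions made purely for illustration; in the actual setting the training values would come from preparing and measuring ground states.

```python
# Minimal sketch (not the paper's construction): kernel-weighted prediction of a
# ground-state property over a low-dimensional parameter space. The Gaussian
# kernel stands in for the positive "good kernels" referred to in the abstract.
import numpy as np

rng = np.random.default_rng(0)

m = 2            # dimension of the parameter space (finite, small)
N = 400          # number of sampled parameter points (training set)
sigma = 0.1      # kernel bandwidth (hypothetical choice)

def true_property(x):
    """Synthetic stand-in for a measured ground-state property tr(O rho(x));
    in practice these values would come from preparing and measuring states."""
    return np.cos(2 * np.pi * x[..., 0]) * np.sin(np.pi * x[..., 1])

x_train = rng.uniform(0.0, 1.0, size=(N, m))
y_train = true_property(x_train) + 0.05 * rng.normal(size=N)  # measurement noise

def gaussian_kernel(x, xs, sigma):
    """Positive kernel weights between a query point x and sampled points xs."""
    d2 = np.sum((xs - x) ** 2, axis=1)
    return np.exp(-d2 / (2 * sigma ** 2))

def predict(x):
    """Kernel-weighted average (Nadaraya-Watson) of the measured values."""
    w = gaussian_kernel(x, x_train, sigma)
    return np.dot(w, y_train) / np.sum(w)

x_query = np.array([0.3, 0.7])
print("predicted:", predict(x_query), " true:", true_property(x_query))
```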
Abstract: Feynman path integrals provide an elegant, classically inspired representation of the quantum propagator and of quantum dynamics, obtained by summing over the vast manifold of all possible paths. From a computational and simulational perspective, ergodically sampling the whole path manifold is a hard problem. Machine learning can help to efficiently identify the relevant subspace and the intrinsic structure residing in a small fraction of the vast path manifold. In this work, we propose the concept of a Feynman path generator, which efficiently generates Feynman paths with fixed endpoints from a (low-dimensional) latent space by targeting a desired density of paths in Euclidean space-time. With such path generators, the Euclidean propagator as well as the ground-state wave function can be estimated efficiently for a generic potential energy. Our work leads to a fresh approach for calculating the quantum propagator, paves the way toward generative modelling of Feynman paths, and may also provide a new perspective on the quantum-classical correspondence through deep learning.
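As a heavily simplified illustration, not the trained generator proposed in the work, the sketch below maps latent Gaussian noise to discretized fixed-endpoint paths via a Brownian-bridge construction and uses them in a Feynman-Kac Monte-Carlo estimate of the Euclidean propagator. The harmonic potential, the units ($\hbar = m = \omega = 1$), the endpoints, the trapezoidal discretization, and the function names are all assumptions chosen only to make the example concrete and checkable against the known exact result.

```python
# Minimal sketch (not the paper's trained generator): a fixed-endpoint path
# "generator" mapping latent Gaussian noise to discrete Brownian bridges, plus a
# Feynman-Kac Monte-Carlo estimate of the Euclidean propagator (hbar = m = 1).
import numpy as np

rng = np.random.default_rng(0)

def V(x):
    """Harmonic potential, used only as a concrete example."""
    return 0.5 * x ** 2

def generate_paths(z, x_a, x_b, T):
    """Map latent noise z of shape (n_paths, M-1) to paths of shape (n_paths, M+1).

    Intermediate points are drawn sequentially as a discrete Brownian bridge, so
    the generated paths follow the free-particle (kinetic) path weight."""
    n_paths, num_latent = z.shape
    M = num_latent + 1
    dt = T / M
    paths = np.empty((n_paths, M + 1))
    paths[:, 0] = x_a
    paths[:, -1] = x_b
    for j in range(1, M):
        t_prev = (j - 1) * dt
        mean = paths[:, j - 1] + (x_b - paths[:, j - 1]) * dt / (T - t_prev)
        var = dt * (T - j * dt) / (T - t_prev)
        paths[:, j] = mean + np.sqrt(var) * z[:, j - 1]
    return paths

def euclidean_propagator(x_a, x_b, T, M=200, n_paths=20000):
    """Feynman-Kac estimate: K = K_free * E_bridge[exp(-integral of V d(tau))]."""
    dt = T / M
    z = rng.normal(size=(n_paths, M - 1))            # latent-space samples
    paths = generate_paths(z, x_a, x_b, T)
    # Trapezoidal estimate of the potential part of the Euclidean action.
    s_pot = dt * (np.sum(V(paths[:, 1:-1]), axis=1)
                  + 0.5 * (V(paths[:, 0]) + V(paths[:, -1])))
    k_free = np.sqrt(1.0 / (2 * np.pi * T)) * np.exp(-(x_b - x_a) ** 2 / (2 * T))
    return k_free * np.mean(np.exp(-s_pot))

x_a, x_b, T = 0.0, 0.5, 1.0
k_mc = euclidean_propagator(x_a, x_b, T)
# Exact Euclidean propagator of the harmonic oscillator, for comparison.
k_exact = np.sqrt(1.0 / (2 * np.pi * np.sinh(T))) * np.exp(
    -((x_a ** 2 + x_b ** 2) * np.cosh(T) - 2 * x_a * x_b) / (2 * np.sinh(T)))
print(f"Monte-Carlo estimate: {k_mc:.4f}   exact (harmonic): {k_exact:.4f}")
```

In this toy version the "latent space" is simply the Gaussian noise driving the bridge; the abstract's proposal is to replace such a fixed construction with a learned generator that targets a desired path density.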