Abstract: Classical physics-informed neural networks (PINNs) approximate solutions to PDEs using deep neural networks trained to satisfy the differential operator and the relevant boundary conditions. We revisit this idea in the quantum computing realm, using parameterised random quantum circuits as trial solutions. We further adapt recent PINN-based techniques, in particular Gaussian smoothing, to our quantum setting. Our analysis concentrates on the Poisson, heat and Hamilton-Jacobi-Bellman equations, which are ubiquitous in most areas of science. On the theoretical side, we develop a complexity analysis of this approach and show numerically that random quantum networks can outperform more traditional quantum networks as well as random classical networks.
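As a point of reference for the approach described above (a minimal sketch, not taken from the paper), the classical PINN loss for the one-dimensional Poisson problem $-u''(x) = f(x)$ with zero boundary conditions can be written in a few lines of PyTorch; the quantum version would replace the network below by a parameterised random quantum circuit and may combine it with Gaussian smoothing of the inputs.

```python
import torch

# trial solution: a small fully connected network u_theta(x)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
f = lambda x: (torch.pi ** 2) * torch.sin(torch.pi * x)   # source term; exact solution u(x) = sin(pi x)

for step in range(2000):
    x = torch.rand(128, 1, requires_grad=True)             # interior collocation points in (0, 1)
    u = net(x)
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    d2u = torch.autograd.grad(du, x, torch.ones_like(du), create_graph=True)[0]
    residual = -d2u - f(x)                                  # Poisson residual -u'' - f
    xb = torch.tensor([[0.0], [1.0]])                       # boundary points
    loss = (residual ** 2).mean() + (net(xb) ** 2).mean()   # PDE loss + boundary penalty
    opt.zero_grad(); loss.backward(); opt.step()
```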
Abstract: This article provides an overview of Natural Language Processing techniques in the framework of financial regulation, more specifically for performing semantic matching between rules and policies when no dataset is available for supervised learning. We outline how to outperform simple pre-trained sentence-transformer models using freely available resources, and explain the mathematical concepts behind the key building blocks of Natural Language Processing.
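A minimal sketch of such a semantic matching step, assuming the sentence-transformers library; the model checkpoint name and the rule/policy texts below are illustrative placeholders rather than the resources used in the article.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

# placeholder texts for illustration only
rules = ["Client assets must be segregated from the firm's own assets."]
policies = [
    "Customer funds are held in separate accounts from company money.",
    "Employees must complete annual cybersecurity training.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")   # an example public checkpoint
r_emb = model.encode(rules)                        # (n_rules, d) embeddings
p_emb = model.encode(policies)                     # (n_policies, d) embeddings

def cosine(a, b):
    # pairwise cosine similarity between two sets of embeddings
    return a @ b.T / (np.linalg.norm(a, axis=1, keepdims=True) * np.linalg.norm(b, axis=1))

scores = cosine(r_emb, p_emb)   # similarity of each rule to each policy
best = scores.argmax(axis=1)    # index of the best-matching policy per rule
```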
Abstract: Universal approximation theorems are the foundations of classical neural networks, providing theoretical guarantees that the latter are able to approximate maps of interest. Recent results have shown that this can also be achieved in a quantum setting, whereby classical functions can be approximated by parameterised quantum circuits. We provide here precise error bounds for specific classes of functions and extend these results to the new setting of randomised quantum circuits, mimicking classical reservoir neural networks. Our results show in particular that a quantum neural network with $\mathcal{O}(\varepsilon^{-2})$ weights and $\mathcal{O}(\lceil \log_2(\varepsilon^{-1}) \rceil)$ qubits suffices to achieve accuracy $\varepsilon>0$ when approximating functions with integrable Fourier transform.
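The randomised, reservoir-style setup can be illustrated as follows; this is a hedged sketch assuming PennyLane as the simulator, which the abstract does not specify: the circuit parameters are drawn at random and frozen, and only a classical linear readout over qubit expectation values is fitted to the target function.

```python
import numpy as np
import pennylane as qml

n_qubits, n_layers = 4, 3
dev = qml.device("default.qubit", wires=n_qubits)
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, size=(n_layers, n_qubits))   # frozen random circuit parameters

@qml.qnode(dev)
def reservoir(x):
    # data encoding followed by a fixed random entangling circuit
    for w in range(n_qubits):
        qml.RY(x, wires=w)
    for l in range(n_layers):
        for w in range(n_qubits):
            qml.RZ(theta[l, w], wires=w)
        for w in range(n_qubits - 1):
            qml.CNOT(wires=[w, w + 1])
    return [qml.expval(qml.PauliZ(w)) for w in range(n_qubits)]

# target: a smooth function with integrable Fourier transform
xs = np.linspace(-np.pi, np.pi, 50)
target = np.exp(-xs ** 2)
features = np.array([reservoir(x) for x in xs])                 # (50, n_qubits) readout features
coeffs, *_ = np.linalg.lstsq(features, target, rcond=None)      # trained linear readout
pred = features @ coeffs
```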
Abstract: Generative Adversarial Networks are becoming a fundamental tool in Machine Learning, in particular in the context of improving the stability of deep neural networks. At the same time, recent advances in Quantum Computing have shown that, despite the absence of a fault-tolerant quantum computer so far, quantum techniques can provide an exponential advantage over their classical counterparts. We develop a fully connected Quantum Generative Adversarial Network and show how it can be applied in Mathematical Finance, with a particular focus on volatility modelling.
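A toy sketch of the adversarial setup, assuming PennyLane and PyTorch (neither is specified in the abstract): a parameterised quantum circuit acts as the generator, a small classical network as the discriminator, and the "volatility" samples are placeholders rather than the data used in the paper.

```python
import torch
import pennylane as qml

n_qubits, n_layers = 3, 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch", diff_method="backprop")
def generator(z, weights):
    # encode the latent noise, then apply trainable entangling layers
    for w in range(n_qubits):
        qml.RY(z[w], wires=w)
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))   # scalar readout in [-1, 1]

gen_weights = torch.rand((n_layers, n_qubits, 3), requires_grad=True)
discriminator = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.ReLU(),
    torch.nn.Linear(16, 1), torch.nn.Sigmoid(),
)
opt_g = torch.optim.Adam([gen_weights], lr=0.01)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=0.01)
bce = torch.nn.BCELoss()

real = 0.2 * torch.rand(8, 1) + 0.1            # placeholder "volatility" samples
for step in range(5):
    z = torch.rand(8, n_qubits) * torch.pi     # latent noise
    fake = torch.stack([(1 + generator(zi, gen_weights)) / 2 for zi in z]).reshape(-1, 1).float()
    # discriminator update: real samples labelled 1, generated samples labelled 0
    d_loss = bce(discriminator(real), torch.ones(8, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(8, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # generator update: push the discriminator towards labelling generated samples as real
    g_loss = bce(discriminator(fake), torch.ones(8, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```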