Abstract: Quadratic Unconstrained Binary Optimization (QUBO) lies at the heart of many industrial and academic fields, including logistics, supply-chain management, finance, pharmaceutical science, chemistry, IT, and energy. These problems typically involve optimizing a large number of binary variables, which makes finding exact solutions exponentially harder as the problem size grows; indeed, QUBO is NP-hard in general. To address this challenge, we developed a powerful feedforward neural network (FNN) optimizer for arbitrary QUBO problems. In this work, we demonstrate that the FNN optimizer can provide high-quality approximate solutions for large problems, including dense 80-variable weighted MaxCut and random QUBOs, achieving an average accuracy of over 99% in less than 1.1 seconds on an 8-core CPU. Additionally, the FNN optimizer outperformed the Gurobi optimizer by 72% on 200-variable random QUBO problems within a 100-second computation time limit, exhibiting strong potential for real-time optimization tasks. Building on this model, we explored the novel approach of integrating FNNs with a quantum-annealer-based activation function to create a quantum-classical encoder-decoder (QCED) optimizer, aiming to further enhance the performance of FNNs in QUBO optimization.
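To make the problem setting concrete, below is a minimal sketch (our own illustration, not the FNN or QCED optimizer described above): it minimizes a toy QUBO objective x^T Q x over x in {0,1}^n via a sigmoid-relaxed gradient descent and compares the result against brute force. All names and hyperparameters (qubo_energy, the learning rate, the step count) are assumptions made for illustration.

```python
# Minimal sketch (not the paper's FNN optimizer): minimize the QUBO
# objective x^T Q x over binary x via a crude continuous relaxation
# with sigmoid-parameterized variables and gradient descent.
import numpy as np

rng = np.random.default_rng(0)
n = 12                                  # kept small so brute force is feasible
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2                       # symmetric QUBO matrix

def qubo_energy(x, Q):
    """Objective value x^T Q x for a binary vector x."""
    return float(x @ Q @ x)

# Continuous relaxation: x = sigmoid(theta), minimized by gradient descent.
theta = rng.normal(size=n)
lr = 0.1
for _ in range(2000):
    x = 1.0 / (1.0 + np.exp(-theta))    # relaxed binary variables in (0, 1)
    grad_x = (Q + Q.T) @ x              # d(x^T Q x)/dx
    theta -= lr * grad_x * x * (1 - x)  # chain rule through the sigmoid
x_bin = (1.0 / (1.0 + np.exp(-theta)) > 0.5).astype(float)

# Brute-force optimum for comparison (only feasible at this toy size).
best = min(qubo_energy(np.array(b), Q) for b in np.ndindex(*(2,) * n))
print("relaxation:", qubo_energy(x_bin, Q), " exact:", best)
```

Note that the brute-force comparison is only possible at toy sizes; for the 80- and 200-variable instances benchmarked above, only approximate methods remain practical.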
Abstract: The Quantum Convolutional Neural Network (QCNN) is a quantum circuit model inspired by the architecture of Convolutional Neural Networks (CNNs). The success of CNNs is largely due to their ability to learn high-level features from raw data rather than requiring manual feature design. Neural Architecture Search (NAS) continues this trend by learning the network architecture itself, alleviating the need for its manual construction, and has been able to generate state-of-the-art models automatically. Search space design is a crucial step in NAS, and there is currently no formal framework through which it can be achieved for QCNNs. In this work, we provide such a framework by utilizing techniques from NAS to create an architectural representation for QCNNs that facilitates search space design and automatic model generation. This is done by specifying primitive operations, such as convolutions and pooling, in such a way that they can be dynamically stacked on top of each other to form different architectures. In this way, QCNN search spaces can be created by controlling the sequence and hyperparameters of stacked primitives, allowing different design motifs to be captured. We show this by generating QCNNs that belong to a popular family of parametric quantum circuits: those resembling reverse binary trees. We then benchmark this family of models on a music genre classification dataset, GTZAN, showing that altering the architecture impacts model performance more than other modelling components, such as the choice of unitary ansatz and data encoding; this yields a way to improve model performance without increasing its complexity. Finally, we provide an open-source Python package, based on the work presented in this paper, that enables dynamic QCNN creation either automatically or by hand, facilitating search space design.
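To illustrate the stacking idea, the sketch below (our own toy representation, not the interface of the released package) encodes convolution and pooling as primitives acting on a list of active qubits; stacking a convolution followed by a pooling halves the register at each level, producing the reverse-binary-tree motif mentioned above. All names (Architecture, convolution, pooling) are hypothetical.

```python
# Minimal sketch of the "stacked primitives" idea (illustrative only; not
# the API of the paper's package): convolution and pooling primitives act
# on a list of active qubits, and stacking them yields an architecture.
from dataclasses import dataclass, field

@dataclass
class Architecture:
    qubits: list                              # currently active qubits
    ops: list = field(default_factory=list)   # (name, wires) records

def convolution(arch, stride=1):
    """Apply a two-qubit unitary between each active qubit and its neighbour."""
    q = arch.qubits
    for i in range(len(q)):
        arch.ops.append(("conv_unitary", (q[i], q[(i + stride) % len(q)])))
    return arch

def pooling(arch):
    """Measure/discard every second qubit, halving the register
    (repeated pooling gives the reverse-binary-tree motif)."""
    q = arch.qubits
    for kept, dropped in zip(q[::2], q[1::2]):
        arch.ops.append(("pool_unitary", (dropped, kept)))
    arch.qubits = q[::2]
    return arch

# Stack primitives dynamically until one readout qubit remains.
arch = Architecture(qubits=list(range(8)))
while len(arch.qubits) > 1:
    arch = pooling(convolution(arch))

for name, wires in arch.ops:
    print(name, wires)
print("readout qubit:", arch.qubits[0])
```

A search space then corresponds to the set of such primitive sequences together with their hyperparameters (e.g., the convolution stride or the choice of unitary filling each primitive).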