Abstract: Hyperparameter optimization is a challenging problem in developing deep neural networks. Deciding which layers to transfer and which to leave trainable is a major task in the design of transfer convolutional neural networks (CNNs). Conventional transfer CNN models are usually designed manually, based on intuition. In this paper, a genetic algorithm is applied to select the trainable layers of the transfer model. The fitness criterion is constructed from classification accuracy and the number of trainable layers. The results show that the method is competent for this task: the system converges, with a precision of 97% on the Cats and Dogs classification dataset, in no more than 15 generations. Moreover, backward inference from the results of the genetic algorithm shows that our method can capture the gradient features in network layers, which contributes to the understanding of transfer AI models.
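The search described above can be sketched as a genetic algorithm over binary masks, where bit i marks layer i as trainable and the fitness combines accuracy with a penalty on the trainable-layer count. This is a minimal illustration, not the paper's implementation: the population size, penalty weight, operators, and the `toy_accuracy` surrogate (standing in for actually fine-tuning and evaluating the transfer CNN) are all assumptions.

```python
import random

def evolve_trainable_mask(num_layers, accuracy_fn, penalty=0.01,
                          pop_size=20, generations=15, seed=0):
    """Genetic search over binary masks: mask[i] = 1 means layer i is trainable.
    Fitness rewards accuracy and penalizes the count of trainable layers."""
    rng = random.Random(seed)

    def fitness(mask):
        return accuracy_fn(mask) - penalty * sum(mask)

    # Random initial population of layer masks.
    pop = [[rng.randint(0, 1) for _ in range(num_layers)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, num_layers)      # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(num_layers)           # single-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Hypothetical surrogate: pretend unfreezing the last layers helps most,
# in place of training the transfer CNN and measuring real accuracy.
def toy_accuracy(mask):
    return 0.8 + 0.05 * sum(mask[-3:]) - 0.01 * sum(mask[:-3])

best = evolve_trainable_mask(10, toy_accuracy)
print(best)
```

In a real run, `accuracy_fn` would fine-tune the pretrained CNN with the masked layers unfrozen and return validation accuracy, which dominates the cost of the search.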
Abstract: Generalization is an important property of neural networks, and it has been studied extensively. Recently, the development of quantum computing has brought new opportunities to this area. In this paper, we study a class of quantum neural networks constructed from quantum gates. In this model, we first map the feature data to a quantum state in Hilbert space, then apply a unitary evolution to it, and finally obtain the classification result by performing a measurement on the quantum state. Since all operations in quantum neural networks are unitary, the parameters constitute a hypersphere in Hilbert space. Compared with a traditional neural network, this parameter space is flatter, so the model is less likely to fall into a local optimum, which means that quantum neural networks have better generalization. To validate our proposal, we evaluated our model on three public datasets; the results demonstrate that our model has better generalization than a classical neural network with the same structure.
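The encode → unitary evolution → measurement pipeline above can be sketched on a single qubit. This is a minimal illustration under assumed choices, not the paper's model: angle encoding of a scalar feature, a single real rotation `R(theta)` as the unitary, and a 0.5 threshold on the |0⟩ outcome probability are all assumptions.

```python
import math

def encode(x):
    # Angle encoding: map a scalar feature to the single-qubit state
    # |psi> = cos(x)|0> + sin(x)|1>, which has unit norm by construction.
    return (math.cos(x), math.sin(x))

def rotate(state, theta):
    # Unitary (here real orthogonal) evolution R(theta) on the qubit.
    # Because the map is norm-preserving, the trainable parameter theta
    # moves the state along a (hyper)sphere rather than through R^n.
    a, b = state
    return (math.cos(theta) * a - math.sin(theta) * b,
            math.sin(theta) * a + math.cos(theta) * b)

def predict(x, theta):
    # Measurement: the probability of outcome |0> is |amplitude|^2;
    # classify by thresholding that probability at 0.5.
    a, _ = rotate(encode(x), theta)
    p0 = a * a
    return 0 if p0 >= 0.5 else 1

print(predict(0.0, 0.0))          # state stays at |0>        -> class 0
print(predict(0.0, math.pi / 2))  # rotated to |1>            -> class 1
```

Training would adjust `theta` to minimize classification error; since every operation is unitary, any choice of `theta` keeps the state on the unit sphere, which is the constrained, flatter parameter space the abstract refers to.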