Convolutional neural networks (CNNs) are a representative class of deep learning models that use convolution operations to perform translation-invariant classification of input data through a hierarchical architecture. However, classical CNN training relies on gradient-based optimization such as steepest descent, and learning performance is strongly influenced by the initial weights of the convolutional and fully connected layers, so the network must be re-tuned to perform well under different model structures and datasets. Exploiting the strength of the simulated annealing algorithm in global search, we propose applying it to the hyperparameter search process to improve the effectiveness of CNNs. In this paper, we introduce SA-CNN, a neural network for text classification tasks built on Text-CNN, and implement simulated annealing for hyperparameter search. Experiments demonstrate higher classification accuracy than earlier manually tuned models, together with substantial savings in the time and space required for exploration compared with manual tuning.
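To make the search procedure concrete, the following is a minimal Python sketch of simulated-annealing hyperparameter search under stated assumptions: the search space, the perturbation scheme, the geometric cooling schedule, and the surrogate `evaluate` objective are all illustrative, not the paper's exact implementation. In practice, `evaluate` would train the Text-CNN with the candidate configuration and return its validation accuracy.

```python
import math
import random

# Hypothetical search space; these hyperparameters and ranges are
# illustrative assumptions, not the paper's configuration.
SPACE = {
    "learning_rate": (1e-4, 1e-1),   # sampled log-uniformly
    "num_filters":   (32, 256),      # integer
    "dropout":       (0.1, 0.7),     # uniform
}

def random_config():
    """Draw an initial hyperparameter configuration from SPACE."""
    lo, hi = SPACE["learning_rate"]
    return {
        "learning_rate": 10 ** random.uniform(math.log10(lo), math.log10(hi)),
        "num_filters": random.randint(*SPACE["num_filters"]),
        "dropout": random.uniform(*SPACE["dropout"]),
    }

def neighbor(cfg):
    """Perturb one hyperparameter to produce a nearby candidate."""
    new = dict(cfg)
    key = random.choice(list(new))
    lo, hi = SPACE[key]
    if key == "num_filters":
        new[key] = min(hi, max(lo, new[key] + random.randint(-32, 32)))
    elif key == "learning_rate":
        new[key] = min(hi, max(lo, new[key] * 10 ** random.uniform(-0.3, 0.3)))
    else:
        new[key] = min(hi, max(lo, new[key] + random.uniform(-0.05, 0.05)))
    return new

def evaluate(cfg):
    """Placeholder objective: stands in for training the Text-CNN with
    `cfg` and returning validation accuracy. This toy surrogate simply
    rewards mid-range values so the example runs standalone."""
    return (1.0
            - abs(math.log10(cfg["learning_rate"]) + 2.5) * 0.1
            - abs(cfg["num_filters"] - 128) / 1000
            - abs(cfg["dropout"] - 0.5) * 0.2)

def simulated_annealing(t0=1.0, t_min=1e-3, alpha=0.9, steps_per_temp=5):
    current = random_config()
    current_score = evaluate(current)
    best, best_score = current, current_score
    t = t0
    while t > t_min:
        for _ in range(steps_per_temp):
            cand = neighbor(current)
            cand_score = evaluate(cand)
            delta = cand_score - current_score
            # Accept improvements outright; accept worse candidates with
            # probability exp(delta / t), which lets the search escape
            # local optima early and behave greedily as t shrinks.
            if delta > 0 or random.random() < math.exp(delta / t):
                current, current_score = cand, cand_score
                if current_score > best_score:
                    best, best_score = current, current_score
        t *= alpha  # geometric cooling schedule
    return best, best_score

if __name__ == "__main__":
    cfg, score = simulated_annealing()
    print(f"best config: {cfg}, surrogate score: {score:.4f}")
```

The key property this sketch illustrates is the acceptance rule: unlike greedy manual tuning, simulated annealing occasionally accepts a worse configuration, with a probability that decays as the temperature cools, which is what gives the method its global-search character over the hyperparameter space.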