Artificial neural networks, which are inspired by the learning mechanisms of the brain, have achieved great success in many problems, especially networks with deep layers. In this paper, we propose a nucleus neural network (NNN) and a corresponding method for learning its connection architecture. A nucleus has no regular layers; a neuron may connect to any other neuron in the nucleus. This architecture removes the layer constraint and may lead to more powerful learning capability. With numerous neurons, determining the connections among them is crucial. Based on the principle that more relevant input-output neuron pairs deserve higher connection density, we propose an efficient architecture learning model for the nucleus. Moreover, we improve the learning method for connection weights and biases given the optimized architecture. We find that this novel architecture is robust to irrelevant components in the test data. We therefore construct a new dataset based on MNIST in which the types of digit backgrounds differ between the training and test sets. Experiments demonstrate that the proposed learner achieves significant improvement over traditional learners on this dataset.