Convolutional neural network (CNN) models have achieved great success in many fields, and the networks keep getting deeper. However, is every layer in these networks actually non-trivial? To answer this question, we propose replacing the convolution kernels of individual layers with zeros. We compare the results with the baseline and show that the modified networks reach similar or even identical performance. Although convolution kernels are the core of these networks, we demonstrate that some of them are trivial and that these layers are regular.
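As a minimal sketch of the kernel-zeroing procedure described above (assuming a PyTorch workflow; the layer name, model choice, and `evaluate` helper are illustrative placeholders, not the paper's code):

```python
import torch
import torch.nn as nn

def zero_conv_layer(model: nn.Module, layer_name: str) -> None:
    """Set the weights (and bias, if present) of the named conv layer to zero."""
    module = dict(model.named_modules())[layer_name]
    assert isinstance(module, nn.Conv2d), f"{layer_name} is not a Conv2d layer"
    with torch.no_grad():
        module.weight.zero_()
        if module.bias is not None:
            module.bias.zero_()

# Hypothetical usage: zero one layer of a pretrained network and compare
# its validation accuracy against the unmodified baseline.
# from torchvision.models import resnet18
# model = resnet18(weights="IMAGENET1K_V1")
# zero_conv_layer(model, "layer3.0.conv2")   # layer name chosen for illustration
# baseline_acc = evaluate(baseline_model, val_loader)  # user-supplied helper
# zeroed_acc   = evaluate(model, val_loader)
```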