Channel pruning has demonstrated its effectiveness in compressing ConvNets. In many prior works, the importance of an output feature map is determined solely by its associated filter. However, these methods ignore the small portion of weights in the next layer that disappears when the feature map is removed: because this weight dependency is not taken into account, part of the weights is pruned without ever being evaluated. In addition, many pruning methods rely on a single criterion for evaluation and search for a sweet spot between pruned structure and accuracy in a trial-and-error fashion, which can be time-consuming. To address these issues, we propose CPMC, a channel pruning algorithm via multi-criteria based on weight dependency, which can compress a variety of models efficiently. We define the importance of a feature map in three aspects: its associated weight values, its computational cost, and its parameter quantity. Exploiting weight dependency, we compute the importance from both the associated filter and the corresponding partial weights of the next layer, and then apply global normalization to enable cross-layer comparison. Our method compresses various CNN models, including VGGNet, ResNet, and DenseNet, on multiple image classification datasets. Extensive experiments show that CPMC significantly outperforms prior methods.
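To make the idea of weight-dependency-aware, multi-criteria scoring concrete, the sketch below illustrates one plausible way such a score could be computed for a pair of consecutive convolutions: the weight criterion sums the norm of a channel's own filter with the norm of the next-layer weights that would vanish with it, and the score is then adjusted by the FLOPs and parameters the channel accounts for before being globally normalized for cross-layer comparison. This is a minimal sketch, not the paper's implementation; the function names, the exact combination rule, and the normalization used here are assumptions made purely for illustration.

```python
# Hypothetical sketch of weight-dependency-aware channel scoring.
# NOT the authors' implementation: the combination of the three criteria
# and the global normalization are assumptions for illustration only.
import torch
import torch.nn as nn


def channel_importance(conv: nn.Conv2d, next_conv: nn.Conv2d,
                       out_h: int, out_w: int) -> torch.Tensor:
    """Score each output channel of `conv`, also counting the weights in
    `next_conv` that would be removed with it (weight dependency)."""
    w = conv.weight.detach()                      # [C_out, C_in, k, k]
    w_next = next_conv.weight.detach()            # [C_next, C_out, k, k]

    # Criterion 1 (weight value): L1 norm of the channel's own filter plus
    # the L1 norm of the next-layer weights that consume this channel.
    own = w.abs().sum(dim=(1, 2, 3))              # per output channel of conv
    dependent = w_next.abs().sum(dim=(0, 2, 3))   # per input channel of next_conv
    weight_score = own + dependent

    # Criterion 2 (computational cost): rough multiply-accumulate count
    # attributable to one output channel of this layer.
    k = conv.kernel_size[0] * conv.kernel_size[1]
    flops_per_channel = conv.in_channels * k * out_h * out_w

    # Criterion 3 (parameter quantity): parameters removed if the channel
    # is pruned, in this layer and in the dependent slice of the next layer.
    params_per_channel = (conv.in_channels * k
                          + next_conv.out_channels
                          * next_conv.kernel_size[0] * next_conv.kernel_size[1])

    # Assumed combination rule: scale the weight score by the resources the
    # channel consumes, so cheap-but-useful channels rank higher.
    return weight_score / (flops_per_channel + params_per_channel)


def globally_normalized_scores(layer_scores):
    """Standardize per-layer scores so channels can be ranked across layers."""
    normalized = [(s - s.mean()) / (s.std() + 1e-8) for s in layer_scores]
    return torch.cat(normalized)
```

A pruner built on such scores would repeatedly remove the channels with the lowest globally normalized importance until a target FLOP or parameter budget is reached.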