Abstract: We propose a novel neural network structure called CrossNets, which generalizes network architectures to directed acyclic graphs. This structure builds on previous generalizations of sequential feed-forward models, such as ResNets, by allowing all forward cross-connections between both adjacent and non-adjacent layers. These cross-connections increase information flow throughout the network, leading to better training and testing performance. We evaluate the network on both image classification and compression tasks using several datasets: MNIST, FER, CIFAR-10, CIFAR-100, and SVHN. We conclude with a proof that CrossNets converge to a local minimum of the error when connection weights are trained by backpropagation with momentum.
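
To make the cross-connection idea concrete, the following is a minimal sketch of a DAG-style block in which every layer receives forward connections from all preceding layers. The abstract does not specify how incoming activations are aggregated; summation is assumed here (by analogy with ResNets), and all names and dimensions are illustrative rather than the paper's actual implementation.

```python
import torch
import torch.nn as nn


class CrossNetBlock(nn.Module):
    """Hypothetical sketch: each layer takes input from ALL earlier
    layers (adjacent and non-adjacent), not just its predecessor."""

    def __init__(self, width: int, n_layers: int):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.Linear(width, width) for _ in range(n_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outputs = [x]  # activations of the input and every layer so far
        for layer in self.layers:
            # Aggregate all earlier activations by summation (an
            # assumption); these are the forward cross-connections.
            h = torch.relu(layer(sum(outputs)))
            outputs.append(h)
        return outputs[-1]
```

With all cross-connections removed except the one from the immediately preceding layer, the block reduces to an ordinary sequential feed-forward stack, which is the sense in which the construction generalizes such models.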