Matrix completion has received a vast amount of attention and research effort owing to its wide range of applications in many fields. Existing matrix completion methods consider only nonlinear (or only linear) relations among the entries of a data matrix and ignore the latent linear (or nonlinear) relationships. This paper introduces a new latent variable model for the data matrix that combines linear and nonlinear models, and designs a novel deep-neural-network-based matrix completion algorithm to address both linear and nonlinear relations among the entries of the data matrix. The proposed method consists of two branches. The first branch learns the latent representations of the columns and reconstructs the columns of the partially observed matrix through a series of hidden neural network layers. The second branch does the same for the rows. In addition, based on multi-task learning principles, we enforce these two branches to work together and introduce a new regularization technique to reduce over-fitting. More specifically, recovering the missing entries of the data matrix is performed as the main task and manifold learning is performed as an auxiliary task. The auxiliary task constrains the weights of the network, so it can be considered a regularizer that improves the main task and reduces over-fitting. Experimental results obtained on synthetic data and several real-world datasets verify the effectiveness of the proposed method compared with state-of-the-art matrix completion methods.
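
The sketch below illustrates the two-branch idea described above: one network reconstructs the columns and another the rows of the partially observed matrix, while a manifold-style penalty on the latent codes serves as the auxiliary task. It is a minimal illustration under assumed design choices (an MLP for each branch, a graph-Laplacian penalty as the manifold term, and illustrative layer sizes and weighting), not the exact architecture or loss used in the paper.

```python
# Minimal sketch of the two-branch completion model (assumptions: MLP branches,
# graph-Laplacian manifold penalty, hypothetical layer sizes and weighting alpha).
import torch
import torch.nn as nn

class BranchAutoencoder(nn.Module):
    """Learns latent codes for the columns (or rows) of a partially observed matrix."""
    def __init__(self, dim, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, dim))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def completion_loss(X, mask, col_net, row_net, L_cols, L_rows, alpha=0.1):
    """Main task: reconstruct the observed entries from both branches.
    Auxiliary task (regularizer): keep latent codes smooth over neighborhood
    graphs of the columns/rows, encoded by Laplacians L_cols and L_rows
    (how these graphs are built is an assumption here)."""
    z_c, Xc_hat = col_net(X.t())   # each input is a column of X
    z_r, Xr_hat = row_net(X)       # each input is a row of X
    recon = (((Xc_hat.t() - X) * mask) ** 2).sum() + (((Xr_hat - X) * mask) ** 2).sum()
    # trace(Z^T L Z) penalizes latent codes that differ between neighboring columns/rows
    manifold = torch.trace(z_c.t() @ L_cols @ z_c) + torch.trace(z_r.t() @ L_rows @ z_r)
    return recon + alpha * manifold
```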