We introduce a generalization of the linearized Alternating Direction Method of Multipliers (ADMM) to optimize a real-valued function $f$ of multiple arguments with potentially multiple constraints $g_\circ$ on each of them. The function $f$ may be nonconvex as long as it is convex in every argument, while the constraints $g_\circ$ need to be convex but not necessarily smooth. If $f$ is smooth, the proposed Block-Simultaneous Direction Method of Multipliers (bSDMM) can be interpreted as a proximal analog of inexact coordinate descent methods under constraints. Unlike alternative approaches to jointly solving problems with multiple constraints, we do not require the linear operators $L$ of the constraint functions $g(L\,\cdot)$ to be invertible or linked to each other. bSDMM is well-suited for a range of optimization problems, in particular for data analysis, where $f$ is the likelihood function of a model and $L$ could be a transformation matrix describing, e.g., finite differences or basis transforms. We apply bSDMM to the Non-negative Matrix Factorization task of a hyperspectral unmixing problem and demonstrate its convergence and the effectiveness of multiple constraints on both matrix factors. The algorithms are implemented in Python and released as an open-source package.
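For orientation, the following is a minimal sketch of the single-argument, single-constraint building block that bSDMM generalizes, i.e. linearized ADMM for $\min_x f(x) + g(Lx)$, applied to a toy non-negative least-squares problem. It is an illustrative assumption, not the released package or the full block-simultaneous method; all names ($A$, $b$, `prox_f`, `prox_g`, step sizes `mu` and `lam`) are hypothetical.

```python
# Sketch of linearized ADMM for min_x f(x) + g(L x), here with
# f(x) = 0.5*||A x - b||^2, g = indicator of the non-negative orthant,
# and L = I (L could instead be, e.g., a finite-difference matrix).
# This is a toy illustration, not the authors' released implementation.
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 10
A = rng.normal(size=(m, n))
b = rng.normal(size=m)
L = np.eye(n)

def prox_f(v, mu):
    # Proximal operator of mu*f for least squares, via a linear solve:
    # argmin_x 0.5*||A x - b||^2 + (1/(2*mu))*||x - v||^2
    return np.linalg.solve(mu * A.T @ A + np.eye(n), mu * A.T @ b + v)

def prox_g(v, lam):
    # Projection onto the non-negative orthant (prox of the indicator;
    # independent of the step size lam).
    return np.maximum(v, 0.0)

# Step sizes must satisfy 0 < mu <= lam / ||L||_2^2.
lam = 1.0
mu = lam / np.linalg.norm(L, 2) ** 2

x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)
for _ in range(500):
    # Linearized x-update: prox of f at a gradient-like step on the coupling term.
    x = prox_f(x - (mu / lam) * L.T @ (L @ x - z + u), mu)
    z = prox_g(L @ x + u, lam)
    u = u + L @ x - z  # scaled dual update

print("constraint satisfied:", x.min() >= -1e-9)
```

bSDMM extends this scheme to multiple argument blocks of $f$, each with potentially several constraints $g_\circ(L_\circ\,\cdot)$, without requiring the operators $L_\circ$ to be invertible or related to one another.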