We design statistical hypothesis tests for leak detection in water pipeline channels. By applying an appropriate model for signal propagation, we show that the detection problem reduces to one of distinguishing signal from noise, with the noise described by a multivariate Gaussian distribution with unknown covariance matrix. We first design a test procedure based on the generalized likelihood ratio test, which we show through simulations to offer appreciable leak detection performance gains over conventional approaches designed in an analogous context (radar detection). Our proposed method requires estimating the noise covariance matrix, which can become inaccurate in high-dimensional settings and when measurement data are scarce. To address this, we present a second leak detection method that employs a regularized covariance matrix estimate. The regularization parameter is optimized for the leak detection application by applying results from large dimensional random matrix theory. This second approach is shown to yield improved leak detection performance compared with the first, at the expense of higher computational complexity.
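For illustration only, the sketch below shows a generic plug-in GLRT-type detector combined with a shrinkage (regularized) covariance estimate, in the spirit of the approach summarized above. It is not the paper's detector: the signal signature s, the noise-only training snapshots, the shrinkage weight rho, and the decision threshold are all placeholder assumptions, and rho here is a fixed value rather than the random-matrix-theory-optimized choice described in the abstract.

```python
import numpy as np

def shrinkage_covariance(noise_samples, rho):
    """Regularized covariance estimate: (1 - rho) * sample covariance
    plus rho times a scaled identity. rho is a placeholder here; the
    paper instead optimizes it via large dimensional random matrix theory."""
    p, K = noise_samples.shape            # dimension p, K noise-only snapshots
    S = noise_samples @ noise_samples.T / K
    return (1.0 - rho) * S + rho * (np.trace(S) / p) * np.eye(p)

def plug_in_glrt_statistic(x, s, Sigma_hat):
    """GLRT-type statistic for a signal with known signature s and unknown
    amplitude in Gaussian noise, with the (unknown) covariance replaced by
    the plug-in estimate Sigma_hat. Illustrative, not the paper's test."""
    Si = np.linalg.inv(Sigma_hat)
    return float(np.abs(s @ Si @ x) ** 2 / (s @ Si @ s))

# Toy usage: declare a leak if the statistic exceeds a threshold chosen for
# a target false-alarm rate (the threshold value below is arbitrary).
rng = np.random.default_rng(0)
p, K = 16, 32
s = rng.standard_normal(p)                  # assumed propagation signature
noise = rng.standard_normal((p, K))         # noise-only training snapshots
x = 0.8 * s + rng.standard_normal(p)        # test measurement containing a leak
Sigma_hat = shrinkage_covariance(noise, rho=0.1)
T = plug_in_glrt_statistic(x, s, Sigma_hat)
print("test statistic:", T, "-> leak" if T > 5.0 else "-> no leak")
```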