We address the Normalized Signal-to-Noise Ratio (NSNR) metric defined in the seminal paper by Reed, Mallett, and Brennan on adaptive detection. The setting is the detection of a target vector in additive, correlated noise. The NSNR is the ratio between the SNR of a linear detector that uses an estimated noise covariance and the SNR of the clairvoyant detector based on the exact, unknown covariance. Evaluating the NSNR is not straightforward, since it depends on the target vector. To close this gap, we consider the NSNR associated with the worst-case target. Using the Kantorovich inequality, we derive a closed-form expression for this worst-case NSNR. We then prove that the classical Gaussian Kullback-Leibler (KL) divergence bounds it. Numerical experiments with different true covariances and various estimates further suggest that the KL metric is more strongly correlated with the NSNR metric than competing norm-based metrics.
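As a concrete reading of the quantities in the abstract (the notation is ours, not necessarily the paper's): let $s$ denote the target, $R$ the true noise covariance, and $\hat{R}$ its estimate, with the filter $w = \hat{R}^{-1} s$. The SNR of a filter $w$ and the resulting NSNR are
\[
\mathrm{SNR}(w) = \frac{|w^H s|^2}{w^H R w}, \qquad
\rho(s) = \frac{\mathrm{SNR}(\hat{R}^{-1} s)}{\mathrm{SNR}(R^{-1} s)}
        = \frac{\big(s^H \hat{R}^{-1} s\big)^2}
               {\big(s^H \hat{R}^{-1} R \hat{R}^{-1} s\big)\big(s^H R^{-1} s\big)} \le 1,
\]
where the bound follows from the Cauchy-Schwarz inequality. Substituting $t = R^{-1/2} s$ and $A = R^{1/2} \hat{R}^{-1} R^{1/2}$, the Kantorovich inequality yields the worst case over all targets in closed form,
\[
\min_{s \ne 0} \rho(s)
  = \min_{t \ne 0} \frac{(t^H A t)^2}{(t^H A^2 t)(t^H t)}
  = \frac{4\,\lambda_{\min}\lambda_{\max}}{(\lambda_{\min}+\lambda_{\max})^2},
\]
with $\lambda_{\min}, \lambda_{\max}$ the extreme eigenvalues of $A$ (equivalently, of $\hat{R}^{-1} R$). The classical Gaussian KL divergence referenced above is
\[
D_{\mathrm{KL}}\!\left(\mathcal{N}(0,R)\,\|\,\mathcal{N}(0,\hat{R})\right)
 = \tfrac{1}{2}\left[\operatorname{tr}\big(\hat{R}^{-1} R\big)
 - \log\det\big(\hat{R}^{-1} R\big) - n\right].
\]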
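A minimal numerical sketch of the kind of experiment described, not the paper's code: the AR(1)-style true covariance, the dimension, and the sample-covariance estimates below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n = 8

# True covariance: an AR(1)-style Toeplitz matrix (an assumed example).
rho = 0.7
R = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

def worst_case_nsnr(R, R_hat):
    # Closed form via the Kantorovich inequality: only the extreme
    # eigenvalues of R_hat^{-1} R matter.
    lam = np.linalg.eigvals(np.linalg.solve(R_hat, R)).real
    lmin, lmax = lam.min(), lam.max()
    return 4 * lmin * lmax / (lmin + lmax) ** 2

def gaussian_kl(R, R_hat):
    # KL divergence between zero-mean Gaussians N(0, R) and N(0, R_hat).
    M = np.linalg.solve(R_hat, R)
    return 0.5 * (np.trace(M) - np.log(np.linalg.det(M)) - R.shape[0])

# Estimates: sample covariances with a growing number of snapshots; both
# metrics should improve (NSNR toward 1, KL toward 0) as m increases.
for m in (2 * n, 8 * n, 32 * n):
    X = rng.multivariate_normal(np.zeros(n), R, size=m)
    R_hat = X.T @ X / m
    print(m, worst_case_nsnr(R, R_hat), gaussian_kl(R, R_hat))

Note that the worst-case NSNR is invariant to replacing the eigenvalues of $\hat{R}^{-1} R$ by their reciprocals, so either ordering of the two covariances gives the same value.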