Abstract: We design a functional capable of quantifying the amount of global correlations encoded in a given probability distribution $\rho$, by imposing what we call the \textit{Principle of Constant Correlations} (PCC) and using eliminative induction. The functional that remains after eliminative induction is the mutual information (MI), and therefore the MI is designed to quantify the amount of global correlations encoded in $\rho$. The MI is the unique functional capable of determining whether a certain class of inferential transformations, $\rho\xrightarrow{*}\rho'$, preserves, destroys, or creates correlations. Further, our design derivation allows us to improve the notion and efficacy of statistical sufficiency by expressing it in terms of a normalized MI that represents the degree, as a percentage, to which a statistic or transformation is sufficient.
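For concreteness, the snippet below states the standard form of the mutual information for a joint distribution written here as $\rho(x,\theta)$; the variable names and the particular normalization mentioned in the comment are illustrative assumptions, since the abstract does not specify the paper's exact conventions.

\begin{equation}
  % Standard mutual information between x and \theta under the joint
  % distribution \rho(x,\theta), with marginals \rho(x) and \rho(\theta).
  I[\rho] \;=\; \int \mathrm{d}x\,\mathrm{d}\theta\; \rho(x,\theta)\,
    \log\frac{\rho(x,\theta)}{\rho(x)\,\rho(\theta)}
  % A normalized variant (an assumption, not necessarily the one derived
  % in the paper) divides I[\rho'] after a transformation by I[\rho]
  % before it, giving a ratio in [0,1] that can be read as a percentage
  % of sufficiency.
\end{equation}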