Log-Euclidean distances are commonly used to quantify the similarity between positive definite matrices on geometric grounds. This paper analyzes the behavior of this distance when it is used to measure the closeness between independent sample covariance matrices. A closed-form expression is given for the deterministic equivalent of this distance, which asymptotically approximates the actual distance in the large-observation regime, where both the sample size and the observation dimension grow to infinity at the same rate. The deterministic equivalent can be used to analyze the performance of the log-Euclidean metric in comparison with other commonly used metrics, such as the Euclidean norm or the symmetrized Kullback-Leibler divergence.
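For reference, the log-Euclidean distance between two symmetric positive definite matrices $A$ and $B$ is commonly defined through the matrix logarithm as
\[
d_{\mathrm{LE}}(A, B) = \left\| \log A - \log B \right\|_{F},
\]
where $\log(\cdot)$ denotes the matrix logarithm and $\|\cdot\|_{F}$ the Frobenius norm; this is the standard definition and is stated here only to fix notation, since the abstract does not specify the exact form used in the paper.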