The Huber loss is a robust loss function used for a wide range of regression tasks. To utilize the Huber loss, a parameter that controls the transition from a quadratic function to an absolute value function needs to be selected. We believe the standard probabilistic interpretation that relates the Huber loss to the so-called Huber density fails to provide adequate intuition for identifying the transition point. As a result, hyper-parameter search is often necessary to determine an appropriate value. In this work, we propose an alternative probabilistic interpretation of the Huber loss, which relates minimizing the Huber loss to minimizing an upper bound on the Kullback-Leibler divergence between Laplace distributions. Furthermore, we show that the parameters of the Laplace distributions are directly related to the transition point of the Huber loss. We demonstrate through a case study and experimentation on the Faster R-CNN object detector that our interpretation provides an intuitive way to select well-suited hyper-parameters.
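To make the transition parameter concrete, the following is a minimal sketch of the standard Huber loss itself (the textbook definition, not the interpretation proposed in this work): the loss is quadratic for residuals with magnitude at most δ and linear beyond, with the pieces matched in value and slope at |a| = δ.

```python
def huber_loss(residual: float, delta: float) -> float:
    """Standard Huber loss: quadratic for |residual| <= delta,
    linear (absolute-value-like) beyond the transition point delta."""
    a = abs(residual)
    if a <= delta:
        return 0.5 * a ** 2
    # Linear branch, shifted so the two pieces meet smoothly at |a| = delta.
    return delta * (a - 0.5 * delta)


# Small residuals are penalized quadratically; large ones grow only linearly,
# which is what makes the loss robust to outliers.
print(huber_loss(0.5, delta=1.0))  # quadratic branch: 0.5 * 0.25 = 0.125
print(huber_loss(3.0, delta=1.0))  # linear branch: 1.0 * (3.0 - 0.5) = 2.5
```

The choice of δ is exactly the hyper-parameter whose selection this work addresses.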