One of the key approximations in range simulation is downscaling the image, dictated by the natural trigonometric relationships that arise in long-distance viewing. It is well known that standard downsampling applied to an image without prior low-pass filtering leads to a type of signal distortion called \emph{aliasing}. In this study, we aim to model the distortion due to aliasing and show that a downsampled image, once upsampled back through an interpolation process, can be closely approximated by applying an isotropic Gaussian low-pass filter to the original image. In other words, the distortion due to aliasing can be approximately reproduced by low-pass filtering the image with a carefully chosen cut-off frequency. We find that the standard deviation $\sigma$ of the isotropic Gaussian kernel and the reduction factor $m$ (also called the downsampling ratio) satisfy the approximate relationship $m \approx 2 \sigma$. We support this empirically observed relationship with both theoretical arguments and experiments on two relatively small face datasets (Chicago DB, LRFID) as well as TinyImageNet.
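As a rough illustration of the $m \approx 2\sigma$ relationship, the following minimal sketch compares the two degradations numerically. It assumes bilinear interpolation for both the down- and up-sampling steps and uses \texttt{scipy.ndimage}; the helper names \texttt{downsample\_upsample} and \texttt{gaussian\_proxy}, and the random test image, are ours for illustration and are not from the paper's experimental pipeline.

\begin{verbatim}
import numpy as np
from scipy import ndimage


def downsample_upsample(img: np.ndarray, m: float) -> np.ndarray:
    """Downscale a grayscale image by factor m, then upscale back
    to the original size, both via bilinear interpolation (order=1)."""
    small = ndimage.zoom(img, 1.0 / m, order=1)
    return ndimage.zoom(small, (img.shape[0] / small.shape[0],
                                img.shape[1] / small.shape[1]), order=1)


def gaussian_proxy(img: np.ndarray, m: float) -> np.ndarray:
    """Approximate the same degradation with an isotropic Gaussian
    low-pass filter, using the conjectured sigma = m / 2."""
    return ndimage.gaussian_filter(img, sigma=m / 2.0)


rng = np.random.default_rng(0)
img = rng.random((128, 128))  # stand-in for a real image
for m in (2, 4, 8):
    a = downsample_upsample(img, m)
    b = gaussian_proxy(img, m)
    rmse = np.sqrt(np.mean((a - b) ** 2))
    print(f"m={m}: RMSE between down/up and Gaussian proxy = {rmse:.4f}")
\end{verbatim}

On natural images, the residual between the two outputs is expected to shrink as the image content becomes smoother; the sketch merely shows how such a comparison can be set up, not the paper's actual evaluation protocol.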