This paper reports the impact of temperature variation on the inference accuracy of pre-trained all-ferroelectric FinFET deep neural networks, along with plausible design techniques to mitigate this impact. We adopt a pre-trained artificial neural network (NN) with 96.4% inference accuracy on the MNIST dataset as the baseline. The conductance drift of a programmed cell due to temperature change is captured by a compact model over a wide range of gate bias. We observe significant inference accuracy degradation in the analog NN at 233 K for an NN trained at 300 K. Finally, we deploy binary neural networks (BNNs) with "read voltage" optimization to make the NN immune to accuracy degradation under temperature variation, maintaining an inference accuracy of 96.1%.
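As a rough illustration of the evaluation flow described above, the Python sketch below applies a temperature-dependent conductance drift to the weights of a pre-trained classifier and re-measures inference accuracy. The drift law, its parameters (alpha, the noise scale), and the synthetic stand-in data are all illustrative assumptions, not the paper's compact FeFinFET model or the actual MNIST pipeline.

```python
# Minimal sketch: how temperature-induced conductance drift can degrade the
# inference accuracy of a pre-trained analog NN. The linear-in-T drift model
# below is a hypothetical placeholder, NOT the paper's compact model.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for MNIST: 784-dim inputs, 10 classes.
X = rng.normal(size=(1000, 784))
true_W = rng.normal(size=(784, 10))
y = (X @ true_W).argmax(axis=1)

# "Pre-trained" single-layer network: least-squares fit at T0 = 300 K.
Y_onehot = np.eye(10)[y]
W300 = np.linalg.lstsq(X, Y_onehot, rcond=None)[0]

def accuracy(W):
    """Classification accuracy of the linear classifier with weights W."""
    return ((X @ W).argmax(axis=1) == y).mean()

def drift(W, T, alpha=0.004, T0=300.0):
    """Hypothetical drift: a gain error plus read noise, both growing with
    the deviation from the training temperature T0."""
    scale = 1.0 + alpha * (T - T0)
    noise = rng.normal(scale=0.02 * abs(T - T0), size=W.shape)
    return scale * W + noise

# Re-evaluate the "programmed" weights at several operating temperatures.
for T in (300.0, 273.0, 233.0):
    print(f"T = {T:5.1f} K  accuracy = {accuracy(drift(W300, T)):.3f}")
```

Under this toy drift model, accuracy is unchanged at 300 K and falls progressively at lower temperatures, mirroring the qualitative trend the abstract reports at 233 K.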