Collision detection is one of the most challenging tasks for Unmanned Aerial Vehicles (UAVs), especially for small or micro UAVs with limited computational power. In nature, flying insects with compact and simple visual systems demonstrate a remarkable ability to navigate and avoid collisions in complex environments. Locusts are a good example: they avoid collisions within dense swarms by relying on an identified visual neuron called the Lobula Giant Movement Detector (LGMD), which has been modelled and applied to ground robots and vehicles. As the visual neuron of a flying insect, the LGMD is an ideal model for UAV collision detection. However, existing models are inadequate for coping with the complex visual challenges unique to UAVs. In this paper, we propose a new LGMD model for flying robots that applies distributed spatial-temporal computation to both excitation and inhibition, enhancing looming selectivity in flying scenes. The proposed model integrates recently discovered presynaptic connection types of the biological LGMD neuron into a spatial-temporal filter with linear distributed interconnections. Systematic experiments on quadcopter first-person-view (FPV) flight videos demonstrate that the proposed distributed presynaptic structure dramatically enhances the LGMD's looming selectivity, especially in complex UAV flight scenarios.
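
To make the presynaptic structure concrete, the sketch below outlines a generic LGMD-style spatial-temporal filter in which both excitation and inhibition are spatially distributed through convolution kernels and inhibition is delayed by one frame. This is a minimal illustration of this class of model, not the proposed distributed presynaptic mechanism itself; the kernel sizes, the inhibition weight W_INHIB, the helper function lgmd_response, and the sigmoid normalisation are illustrative assumptions.

    # Minimal sketch of an LGMD-style spatial-temporal filter with distributed
    # excitation and inhibition. Kernel shapes, weights, and the output
    # normalisation are illustrative assumptions, not the parameters of the
    # proposed model.
    import numpy as np
    from scipy.signal import convolve2d

    # Spatial kernels: excitation spreads over a small neighbourhood,
    # inhibition over a wider one (assumed 3x3 and 5x5 here).
    W_E = np.ones((3, 3)) / 9.0
    W_I = np.ones((5, 5)) / 25.0
    W_INHIB = 0.6  # relative strength of inhibition (assumed)


    def lgmd_response(frames):
        """Return a per-frame membrane potential in (0, 1) for a grayscale clip.

        frames: iterable of 2-D float arrays with values in [0, 1].
        """
        prev_frame = None
        prev_excitation = None
        outputs = []
        for frame in frames:
            if prev_frame is None:
                prev_frame = frame
                outputs.append(0.0)
                continue
            # P layer: luminance change between consecutive frames.
            p = frame - prev_frame
            # E layer: spatially distributed excitation from the current change.
            e = convolve2d(p, W_E, mode="same")
            # I layer: spatially distributed inhibition, delayed by one frame
            # (taken from the previous excitation when available).
            i = convolve2d(prev_excitation if prev_excitation is not None else p,
                           W_I, mode="same")
            # S layer: excitation minus weighted inhibition, half-wave rectified.
            s = np.maximum(np.abs(e) - W_INHIB * np.abs(i), 0.0)
            # Membrane potential: normalised sum squashed through a sigmoid.
            k = s.sum() / s.size
            outputs.append(1.0 / (1.0 + np.exp(-k)))
            prev_frame, prev_excitation = frame, e
        return outputs

In a full collision-detection pipeline, a looming alarm would typically be raised when this potential exceeds a tuned threshold over several consecutive frames; the spatial spread of the excitatory and inhibitory kernels is what the distributed presynaptic structure of the proposed model refines.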