Drawing on biological findings about fly physiology from the past decade, we present a directionally selective neural network (DSNN) with a feed-forward structure and entirely low-level visual processing, which models the direction-selective neurons in the fly's visual system that are mainly sensitive to wide-field translational movements in four cardinal directions. In this research, we highlight the functionality of the ON and OFF pathways, which separate motion information into parallel computations corresponding to light-on and light-off selectivity. Through this modeling study, we demonstrate several advances over former bio-plausible translational motion detectors, such as the elementary motion detectors. First, we closely mimic the fly's preliminary motion-detecting pathways in accordance with recently revealed fly physiology. Second, we improve the speed response to moving dark/light features through ensembles of same-polarity cells in the dual pathways. Moreover, we alleviate the impact of irrelevant motion in visually cluttered environments, such as background shift and windblown vegetation, by modeling spatiotemporal dynamics. We systematically tested the DSNN against stimuli ranging from synthetic and real-world scenes to, notably, the visual input of a ground micro-robot. The results demonstrate that the DSNN outperforms former bio-plausible translational motion detectors. Importantly, we verified its computational simplicity and effectiveness, which benefit the construction of neuromorphic vision sensors for robots.
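To make the ON/OFF separation concrete, the following is a minimal sketch, assuming a simple half-wave rectification of the temporal luminance change, a common simplification in fly motion-detection models; it is illustrative only and does not reproduce the DSNN's exact formulation, and the function name and parameters are placeholders.

```python
import numpy as np

def split_on_off(prev_frame, curr_frame):
    """Illustrative ON/OFF separation (not the paper's exact model):
    half-wave rectify the temporal luminance change so that brightness
    increments and decrements are routed into parallel channels.
    """
    diff = curr_frame.astype(float) - prev_frame.astype(float)
    on_channel = np.maximum(diff, 0.0)    # responds to light-on (brightening) events
    off_channel = np.maximum(-diff, 0.0)  # responds to light-off (darkening) events
    return on_channel, off_channel

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f0 = rng.random((8, 8))
    f1 = np.roll(f0, shift=1, axis=1)  # toy rightward shift of the scene
    on_ch, off_ch = split_on_off(f0, f1)
    print(on_ch.sum(), off_ch.sum())
```

Downstream direction-selective units would then correlate spatially offset signals within each channel separately, which is the parallel computation the abstract refers to.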