Today, most of the aid systems deployed for visually impaired people are built for a single purpose, be it navigation, object detection, or distance estimation. Moreover, most deployed aid systems rely on indoor navigation, which requires prior knowledge of the environment; as a result, they often fail to help visually impaired people in unfamiliar scenarios. In this paper, we propose an aid system that combines object detection and depth estimation to guide a person without colliding with obstacles. The prototype detects 90 different types of objects and computes their distances from the user. We also implemented a navigation feature that takes the target destination as input from the user and guides the impaired person to his/her destination using the Google Directions API. With this system, we built a multi-feature, high-accuracy navigational aid system that can be deployed in the wild and help visually impaired people in their daily lives by navigating them effortlessly to their desired destination.