Abstract: Robotic perception is emerging as a crucial technology for navigation aids, particularly benefiting individuals with visual impairments through sonification. This paper presents a novel mapping framework that accurately represents spatial geometry for sonification, transforming physical spaces into auditory experiences. By leveraging depth sensors, we convert incrementally built 3D scenes into a compact 360-degree representation based on angular and distance information, aligning with human auditory perception. Our proposed mapping framework utilises a sensor-centric structure, maintaining 2D circular or 3D cylindrical representations, and employs the VDB-GPDF for efficient online mapping. We introduce two sonification modes, circular ranging and circular ranging of objects, along with real-time user control over auditory filters. Incorporating binaural room impulse responses, our framework provides perceptually robust auditory feedback. Quantitative and qualitative evaluations demonstrate superior performance in accuracy, coverage, and timing compared to existing approaches, with effective handling of dynamic objects. The accompanying video showcases the practical application of spatial sonification in room-like environments.
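To make the "compact 360-degree representation based on angular and distance information" concrete, the following is a minimal sketch of collapsing a sensor-centric point cloud into per-azimuth nearest ranges. It is an illustration only: the bin count, the function name `circular_range_map`, and the use of raw NumPy (rather than the paper's VDB-GPDF pipeline) are assumptions, not the authors' implementation.

```python
# Hedged sketch: collapse a sensor-frame point cloud into a 360-degree
# circular range representation (nearest obstacle distance per angular bin).
import numpy as np

def circular_range_map(points_xyz: np.ndarray, num_bins: int = 360) -> np.ndarray:
    """points_xyz: (N, 3) array in the sensor frame; returns per-bin minimum range."""
    x, y = points_xyz[:, 0], points_xyz[:, 1]
    ranges = np.hypot(x, y)                          # horizontal distance to the sensor
    angles = np.mod(np.arctan2(y, x), 2 * np.pi)     # azimuth in [0, 2*pi)
    bins = (angles / (2 * np.pi) * num_bins).astype(int) % num_bins

    circular = np.full(num_bins, np.inf)             # inf = no return in that bin
    np.minimum.at(circular, bins, ranges)            # keep the nearest hit per bin
    return circular

# Example: a single point 2 m directly in front of the sensor fills bin 0.
demo = circular_range_map(np.array([[2.0, 0.0, 0.1]]))
assert np.isclose(demo[0], 2.0)
```

A 3D cylindrical variant, as described in the abstract, would add a second index over height slices; the circular case above is the 2D special instance.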
Abstract: Navigating cluttered environments is a challenging task for any mobile system. Existing approaches for ground-based mobile systems primarily focus on small wheeled robots, which face minimal constraints from overhanging obstacles and cannot manage steps or stairs, making the problem effectively 2D. However, navigation for legged robots (or even humans) has to consider an extra dimension. This paper proposes a tailored scene representation coupled with an advanced trajectory optimisation algorithm to enable safe navigation. Our 3D navigation approach is suitable for any ground-based mobile robot, whether wheeled or legged, as well as for human assistance. Given a 3D point cloud of the scene and the segmentation of the ground and non-ground points, we formulate two Gaussian Process distance fields to ensure a collision-free path and maintain distance-to-ground constraints. Our method adeptly handles uneven terrain, steps, and overhanging objects through an innovative use of a quadtree structure, constructing a multi-resolution map of the free space and its connectivity graph based on a 2D projection of the relevant scene. Evaluations with both synthetic and real-world datasets demonstrate that this approach provides safe and smooth paths, accommodating a wide range of ground-based mobile systems.
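The multi-resolution free-space quadtree over a 2D projection can be sketched as below. This is not the paper's method: it assumes obstacle points have already been separated from ground points and projected to the plane, and the cell size threshold, the point-in-cell occupancy test, the adjacency rule, and all names (`QuadLeaf`, `build_free_space_quadtree`, `connectivity_graph`) are illustrative choices.

```python
# Hedged sketch: recursively subdivide a 2D projection into free-space leaves,
# stopping when a cell contains no obstacle points, then link touching leaves
# into a connectivity graph usable by a path planner.
import numpy as np

class QuadLeaf:
    def __init__(self, cx, cy, half):
        self.cx, self.cy, self.half = cx, cy, half   # cell centre and half-size (m)

def build_free_space_quadtree(obstacles_xy, cx, cy, half, min_half=0.25):
    """Return free-space leaves covering the square cell (cx, cy, half)."""
    inside = obstacles_xy[
        (np.abs(obstacles_xy[:, 0] - cx) <= half) &
        (np.abs(obstacles_xy[:, 1] - cy) <= half)
    ]
    if len(inside) == 0:                             # empty cell -> one free leaf
        return [QuadLeaf(cx, cy, half)]
    if half <= min_half:                             # occupied at finest resolution
        return []
    h = half / 2.0                                   # otherwise subdivide into four
    leaves = []
    for dx in (-h, h):
        for dy in (-h, h):
            leaves += build_free_space_quadtree(inside, cx + dx, cy + dy, h, min_half)
    return leaves

def connectivity_graph(leaves, tol=1e-6):
    """Edges between leaves whose axis-aligned cells overlap or touch."""
    edges = []
    for i, a in enumerate(leaves):
        for j in range(i + 1, len(leaves)):
            b = leaves[j]
            gap_x = abs(a.cx - b.cx) - (a.half + b.half)
            gap_y = abs(a.cy - b.cy) - (a.half + b.half)
            if gap_x <= tol and gap_y <= tol:        # edge- or corner-adjacent
                edges.append((i, j))
    return edges

# Example: one obstacle point inside an 8 m x 8 m region centred at the origin.
leaves = build_free_space_quadtree(np.array([[1.3, 0.7]]), cx=0.0, cy=0.0, half=4.0)
edges = connectivity_graph(leaves)
```

The Gaussian Process distance fields described in the abstract would then supply continuous collision and distance-to-ground costs along paths drawn from this graph; that component is not reproduced here.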