Abstract: Rescue robotics places high demands on perception algorithms due to unstructured and potentially vision-denied environments. Pivoting Frequency-Modulated Continuous Wave (FMCW) radars are an emerging sensing modality for SLAM in such environments. However, the complex noise characteristics of radar make SLAM applications, particularly indoors, computationally demanding and slow. In this work, we introduce a novel radar SLAM framework, RaNDT SLAM, that operates fast and generates accurate robot trajectories. The method is based on the Normal Distributions Transform (NDT) augmented by radar intensity measures. Motion estimation fuses a motion model, IMU data, and registration of the intensity-augmented NDT. We evaluate RaNDT SLAM on a new benchmark dataset and the Oxford Radar RobotCar dataset. The new dataset covers indoor and outdoor environments and multiple sensing modalities (LiDAR, radar, and IMU).
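To make the intensity-augmented NDT idea concrete, below is a minimal Python sketch of how radar intensity could weight the per-cell Gaussian statistics. The function name, weighting scheme, and 2D layout are illustrative assumptions, not RaNDT SLAM's actual implementation.

```python
import numpy as np

def build_intensity_ndt(points, intensities, cell_size=1.0):
    """Build 2D NDT cells whose statistics are weighted by radar intensity.

    Hypothetical sketch: `points` is an (N, 2) array of Cartesian radar
    detections, `intensities` an (N,) array of return strengths used as
    weights. RaNDT SLAM's actual formulation may differ.
    """
    cells = {}
    keys = np.floor(points / cell_size).astype(int)
    for key in np.unique(keys, axis=0):
        mask = np.all(keys == key, axis=1)
        p, w = points[mask], intensities[mask]
        if len(p) < 3:                      # too few points for a stable covariance
            continue
        mean = np.average(p, axis=0, weights=w)
        diff = p - mean
        cov = (w[:, None] * diff).T @ diff / w.sum()
        cells[tuple(key)] = (mean, cov, w.mean())   # keep mean intensity per cell
    return cells
```

Registration would then score a transformed scan against these per-cell Gaussians, with the stored intensity entering the match likelihood alongside the geometric term.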
Abstract: Imaging radar is an emerging sensor modality for Simultaneous Localization and Mapping (SLAM), especially suitable for vision-obstructed environments. This article investigates the use of 4D imaging radars for SLAM and analyzes the challenges of robust loop closure. Previous work indicates that 4D radars, together with inertial measurements, offer ample information for accurate odometry estimation. However, the narrow field of view, limited resolution, and sparse and noisy measurements make loop closure a significantly more challenging problem. Our work builds on previous work, TBV SLAM, which was proposed for robust loop closure with 360$^\circ$ spinning radars. This article highlights and addresses challenges inherent to a directional 4D radar, such as sparsity, noise, and a reduced field of view, and discusses why the common definition of a loop closure is unsuitable. By combining multiple quality measures for accurate loop closure detection adapted to 4D radar data, we achieve significant improvements in trajectory estimation: the absolute trajectory error is as low as 0.46 m over a distance of 1.8 km, with consistent operation across multiple environments.
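As an illustration of fusing several quality measures into a single loop closure decision, the following sketch uses a logistic combination. The measure names, weights, and threshold are hypothetical placeholders, not the ones used in the article.

```python
import numpy as np

def loop_closure_confidence(measures, weights, bias):
    """Fuse multiple alignment-quality measures into one acceptance score.

    `measures` and `weights` are dicts keyed by measure name (e.g.
    registration residual, odometry consistency, appearance similarity).
    A logistic model is one simple way to combine them; the names and
    numbers below are placeholders, not values from the article.
    """
    keys = sorted(measures)
    x = np.array([measures[k] for k in keys])
    w = np.array([weights[k] for k in keys])
    return 1.0 / (1.0 + np.exp(-(w @ x + bias)))    # sigmoid score in (0, 1)

# Hypothetical usage: accept a loop closure candidate only above a threshold.
score = loop_closure_confidence(
    {"appearance": 0.9, "odom_consistency": 1.2, "residual": -0.8},
    {"appearance": 0.7, "odom_consistency": 1.0, "residual": 1.5},
    bias=-0.5,
)
accept = score > 0.75
```

Combining several complementary measures in this way allows a much stricter acceptance criterion than any single measure supports, which matters when sparse, noisy 4D radar data makes individual measures unreliable.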
Abstract: This article describes CFEAR radar odometry, submitted to a competition at the Radar in Robotics workshop at ICRA 2024. CFEAR is an efficient and accurate method for spinning 2D radar odometry that generalizes well across environments. This article presents an overview of the odometry pipeline, with new experiments on the public Boreas dataset. We show that a real-time-capable configuration of CFEAR, with its original parameter set, yields surprisingly low drift on the Boreas dataset. Additionally, we discuss an improved implementation and solving strategy that enable the most accurate configuration to run in real time with improved robustness, reaching as low as 0.61% translation drift at a frame rate of 68 Hz. The source code is available to the community at https://github.com/dan11003/CFEAR_Radarodometry_code_public, and we publish the evaluation from this article at https://github.com/dan11003/cfear_2024_workshop
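For intuition about the 0.61% figure, here is a simplified sketch of a segment-based translational drift computation. The official Boreas/KITTI-style evaluation compares full relative poses over multiple segment lengths, so this function is only an approximation for illustration.

```python
import numpy as np

def translation_drift_percent(gt_xy, est_xy, segment=100.0):
    """Approximate translational drift (%) over fixed-length segments.

    Simplified: compares displacement over `segment` metres of
    ground-truth path between ground truth and estimate. The official
    KITTI/Boreas metric evaluates full relative poses over several
    segment lengths (100 m ... 800 m); this is only for intuition.
    gt_xy, est_xy: (N, 2) arrays of time-aligned planar positions.
    """
    steps = np.linalg.norm(np.diff(gt_xy, axis=0), axis=1)
    arclen = np.concatenate(([0.0], np.cumsum(steps)))   # path length so far
    errors = []
    for i in range(len(gt_xy)):
        j = int(np.searchsorted(arclen, arclen[i] + segment))
        if j >= len(gt_xy):
            break
        gt_delta = np.linalg.norm(gt_xy[j] - gt_xy[i])
        est_delta = np.linalg.norm(est_xy[j] - est_xy[i])
        errors.append(abs(est_delta - gt_delta) / segment)
    return 100.0 * float(np.mean(errors)) if errors else float("nan")
```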