Abstract:Spinning, frequency-modulated continuous-wave (FMCW) radar has been gaining popularity for autonomous vehicle navigation. The spinning radar is chosen over the more classic automotive `fixed' radar as it captures the full 360 degree field of view without requiring multiple sensors and extensive calibration. However, commercially available spinning radar systems have not previously been able to extract radial velocities, owing to the lack of repeated measurements in the same direction and to their fundamental hardware setup. A new firmware upgrade makes it possible to alternate the modulation of the radar signal between azimuths. In this paper, we first present a way to use this alternating modulation to extract radial Doppler velocity measurements from single raw radar intensity scans. We then incorporate these measurements into two different modern odometry pipelines and evaluate them in progressively challenging autonomous driving environments. We show that using Doppler velocity measurements enables our odometry to continue performing at a state-of-the-art level even in severely geometrically degenerate environments.
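The following is a rough sketch of the textbook FMCW range-Doppler coupling that alternating the chirp modulation between azimuths exposes; the exact constants, signs, and processing steps depend on the sensor firmware and are not taken from the paper. With chirp slope S = B/T and carrier frequency f_c, a target at range R with radial velocity v_r appears at a range that is shifted by the Doppler effect, and the shift changes sign with the chirp direction:
\[
r_{\uparrow} \approx R + \frac{f_c}{S}\,v_r, \qquad
r_{\downarrow} \approx R - \frac{f_c}{S}\,v_r
\quad\Rightarrow\quad
v_r \approx \frac{S}{2 f_c}\big(r_{\uparrow} - r_{\downarrow}\big), \qquad
R \approx \tfrac{1}{2}\big(r_{\uparrow} + r_{\downarrow}\big).
\]
Comparing the apparent ranges of the same target under opposite modulations in adjacent azimuths therefore yields a radial velocity estimate from a single intensity scan.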
Abstract:In this technical report, we compare treating an IMU as an input to a motion model against treating it as a measurement of the state in a continuous-time state estimation framework. Treating IMU measurements as inputs to a motion model and then preintegrating these measurements has almost become a de facto standard in many robotics applications. However, this approach has a few shortcomings. First, it conflates the IMU measurement noise with the underlying process noise. Second, it is unclear how the state should be propagated in the case of IMU measurement dropout. Third, it does not lend itself well to dealing with multiple high-rate sensors such as a lidar and an IMU or multiple IMUs. In this work, we methodically compare the performance of these two approaches in a 1D simulation and show that they perform identically, assuming that each method's hyperparameters have been tuned on a training set. We show how to preintegrate heterogeneous factors using Gaussian process interpolation. We also provide results for our continuous-time lidar-inertial odometry in simulation and on the Newer College Dataset. Code for our lidar-inertial odometry can be found at: https://github.com/utiasASRL/steam_icp
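As a point of reference for the 'IMU as input' route critiqued above, here is a minimal 1D preintegration sketch; the alternative discussed in the report treats each IMU reading as a direct measurement of the state and leaves propagation to the continuous-time motion prior. The function and variable names are illustrative assumptions, not taken from the report or the linked repository.

import numpy as np

def preintegrate_1d(accels, dt):
    # Preintegrate 1D accelerometer readings between two estimation times.
    # The increments (dv, dp) become a single relative factor between the
    # bracketing states; note that sensor noise and process noise are
    # rolled together in this factor, which is the first shortcoming above.
    dv, dp = 0.0, 0.0
    for a in accels:
        dp += dv * dt + 0.5 * a * dt ** 2
        dv += a * dt
    return dv, dp

# Example: 1 s of 100 Hz accelerometer readings at 0.1 m/s^2.
dv, dp = preintegrate_1d(np.full(100, 0.1), 0.01)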
Abstract:In this work, we demonstrate continuous-time radar-inertial and lidar-inertial odometry using a Gaussian process motion prior. Using a sparse prior, we demonstrate improved computational complexity during preintegration and interpolation. We use a white-noise-on-acceleration motion prior and treat the gyroscope as a direct measurement of the state while preintegrating accelerometer measurements to form relative velocity factors. Our odometry is implemented using sliding-window batch trajectory estimation. To our knowledge, our work is the first to demonstrate radar-inertial odometry with a spinning mechanical radar using both gyroscope and accelerometer measurements. We improve the performance of our radar odometry by 19% by incorporating an IMU. Our approach is efficient, and we demonstrate real-time performance. Code for this project can be found at: https://github.com/utiasASRL/steam_icp
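As a hedged sketch of the ingredients named above (generic notation, possibly differing from the paper's), the white-noise-on-acceleration prior and one common form of a preintegrated relative velocity factor built from accelerometer measurements, ignoring bias random walk and Earth-rate terms, are
\[
\ddot{\mathbf{p}}(t) = \mathbf{w}(t), \qquad
\mathbf{w}(t) \sim \mathcal{GP}\big(\mathbf{0},\, \mathbf{Q}_c\,\delta(t - t')\big),
\]
\[
\mathbf{v}_j - \mathbf{v}_i - \mathbf{g}\,\Delta t_{ij}
\;\approx\; \sum_{k=i}^{j-1} \mathbf{C}_k\big(\tilde{\mathbf{a}}_k - \mathbf{b}_a\big)\,\Delta t,
\]
where \mathbf{C}_k is the orientation at time t_k (informed by the gyroscope, which is treated as a direct measurement of the state), \tilde{\mathbf{a}}_k the accelerometer reading, \mathbf{b}_a its bias, and \mathbf{g} gravity.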
Abstract:This paper presents a novel deep-learning-based approach to improve localizing radar measurements against lidar maps. Although the state of the art for localization is matching lidar data to lidar maps, radar has been considered a promising alternative, as it is potentially more resilient against adverse weather such as precipitation and heavy fog. To make use of existing high-quality lidar maps, while maintaining performance in adverse weather, matching radar data to lidar maps is of interest. However, owing in part to the unique artefacts present in radar measurements, radar-lidar localization has struggled to achieve comparable performance to lidar-lidar systems, preventing it from being viable for autonomous driving. This work builds on an ICP-based radar-lidar localization system by including a learned preprocessing step that weights radar points based on high-level scan information. Combining a proven analytical approach with learned weights reduces the localization errors of radar-lidar ICP on real-world autonomous driving data by up to 54.94% in translation and 68.39% in rotation, while maintaining interpretability and robustness.
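To illustrate where per-point learned weights can enter an ICP-style pipeline, here is a minimal weighted point-to-point alignment step in 2D; the actual system's ICP variant, weighting network, and preprocessing are its own, so the structure and names below are assumptions for illustration.

import numpy as np

def weighted_alignment_2d(src, tgt, w):
    # src, tgt: (N, 2) arrays of matched radar/lidar points; w: (N,) per-point
    # weights, e.g. produced by a (hypothetical) learned preprocessing network.
    w = np.asarray(w, dtype=float)
    w = w / w.sum()
    mu_s, mu_t = w @ src, w @ tgt                     # weighted centroids
    H = ((src - mu_s) * w[:, None]).T @ (tgt - mu_t)  # weighted cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, np.linalg.det(Vt.T @ U.T)])     # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_t - R @ mu_s
    return R, t  # maps src into the map (tgt) frame

Down-weighting points that the network flags as likely artefacts shrinks their influence on R and t without changing the closed-form solver, which is one way such a combination keeps the estimator interpretable.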
Abstract:In this paper, we present a fast, lightweight odometry method that uses the Doppler velocity measurements from a Frequency-Modulated Continuous-Wave (FMCW) lidar without data association. FMCW lidar is a recently emerging technology that enables per-return relative radial velocity measurements via the Doppler effect. Since the Doppler measurement model is linear with respect to the 6-degrees-of-freedom (DOF) vehicle velocity, we can formulate a linear continuous-time estimation problem for the velocity and numerically integrate for the 6-DOF pose estimate afterward. The caveat is that angular velocity is not observable with a single FMCW lidar. We address this limitation by also incorporating the angular velocity measurements from a gyroscope. This results in an extremely efficient odometry method that processes lidar frames at an average wall-clock time of 5.8 ms on a single thread, well within the 100 ms frame period of the 10 Hz lidar we tested. We show experimental results on real-world driving sequences and compare against state-of-the-art Iterative Closest Point (ICP)-based odometry methods, presenting a compelling trade-off between accuracy and computation. We also present an algebraic observability study, where we demonstrate in theory that the Doppler measurements from multiple FMCW lidars render all 6 degrees of freedom (translational and angular velocity) observable.
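One common way to write the Doppler measurement model referenced above (sign and frame conventions vary, and this is not necessarily the paper's exact notation): for a static point at position \mathbf{p} in the sensor frame, with unit bearing \mathbf{d} = \mathbf{p}/\|\mathbf{p}\| and the sensor's translational and angular velocity (\mathbf{v}_s, \boldsymbol{\omega}_s) expressed in the sensor frame,
\[
v_r \;=\; \mathbf{d}^\top \dot{\mathbf{p}}
\;=\; -\,\mathbf{d}^\top\big(\mathbf{v}_s + \boldsymbol{\omega}_s \times \mathbf{p}\big)
\;=\; -\,\mathbf{d}^\top \mathbf{v}_s,
\]
since \boldsymbol{\omega}_s \times \mathbf{p} is perpendicular to \mathbf{d}: a single lidar's Doppler returns constrain only its own translational velocity, which is the unobservability noted above. Writing the sensor velocity through the extrinsic lever arm \mathbf{r}_s as
\[
\mathbf{v}_s = \mathbf{C}_{sv}\big(\mathbf{v}_v + \boldsymbol{\omega}_v \times \mathbf{r}_s\big)
\]
shows the measurement is linear in the vehicle velocity (\mathbf{v}_v, \boldsymbol{\omega}_v), and that lidars mounted with different lever arms can jointly render all 6 degrees of freedom observable.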
Abstract:Frequency-Modulated Continuous-Wave (FMCW) lidar is a recently emerging technology that additionally enables per-return instantaneous relative radial velocity measurements via the Doppler effect. In this letter, we present the first continuous-time lidar-only odometry algorithm that uses these Doppler velocity measurements from an FMCW lidar to aid odometry in geometrically degenerate environments. We apply an existing continuous-time framework that efficiently estimates the vehicle trajectory using Gaussian process regression, compensating for the motion distortion caused by the scanning-while-moving nature of any mechanically actuated lidar (FMCW and non-FMCW). We evaluate our proposed algorithm on several real-world datasets, including publicly available ones and datasets we collected. Our algorithm outperforms the only existing method that also uses Doppler velocity measurements, and we study difficult conditions where including this extra information greatly improves performance. We additionally demonstrate state-of-the-art performance of lidar-only odometry with and without Doppler velocity measurements in nominal conditions. Code for this project can be found at: https://github.com/utiasASRL/steam_icp.
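To make the continuous-time querying concrete, here is a minimal 1D white-noise-on-acceleration sketch of interpolating the Gaussian-process trajectory at an arbitrary measurement time, which is the mechanism that lets every lidar return be corrected at its own timestamp; the actual implementation operates on SE(3) in the linked repository, and the names below are illustrative only.

import numpy as np

def wnoa_interpolate(x_k, x_k1, t_k, t_k1, tau, Qc=1.0):
    # Interpolate the state x(tau) = [position, velocity] between two estimated
    # states x_k at t_k and x_k1 at t_k1 under a white-noise-on-acceleration
    # (constant-velocity) prior with power spectral density Qc.
    def Phi(dt):   # state transition matrix of the prior
        return np.array([[1.0, dt], [0.0, 1.0]])
    def Q(dt):     # accumulated process-noise covariance over dt
        return Qc * np.array([[dt**3 / 3.0, dt**2 / 2.0],
                              [dt**2 / 2.0, dt]])
    dt1, dt2, dT = tau - t_k, t_k1 - tau, t_k1 - t_k
    Psi = Q(dt1) @ Phi(dt2).T @ np.linalg.inv(Q(dT))
    Lam = Phi(dt1) - Psi @ Phi(dT)
    return Lam @ x_k + Psi @ x_k1

# Example: halfway between two states moving at constant unit velocity,
# the interpolation returns approximately [0.5, 1.0].
x_mid = wnoa_interpolate(np.array([0.0, 1.0]), np.array([1.0, 1.0]), 0.0, 1.0, 0.5)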
Abstract:Radar is a rich sensing modality that is a compelling alternative to lidar for its inherent robustness to precipitation, dust, and fog. In this paper, we investigate radar-based mapping and localization as an alternative to lidar by demonstrating the performance of our own radar pipeline. We present extensive comparisons against a lidar-based pipeline across varying seasonal and weather conditions using our own publicly available dataset, collected on a repeated route over the course of one year. Our experiments show that, while lidar achieves better overall localization accuracy, our radar-based pipeline achieves comparable SE(2) accuracy using a fraction of the computational and memory resources required by the lidar-based pipeline. We additionally demonstrate our radar-based pipeline localizing against lidar maps with better accuracy than the current state of the art for radar-to-lidar localization. Code for this project can be found at: https://github.com/utiasASRL/vtr3
Abstract:The Boreas dataset was collected by driving a repeated route over the course of one year, resulting in stark seasonal variations and adverse weather conditions such as rain and falling snow. In total, the Boreas dataset contains over 350km of driving data featuring a 128-channel Velodyne Alpha-Prime lidar, a 360 degree Navtech CIR304-H scanning radar, a 5MP FLIR Blackfly S camera, and centimetre-accurate post-processed ground truth poses. At launch, our dataset will support live leaderboards for odometry, metric localization, and 3D object detection. The dataset and development kit are available at: https://www.boreas.utias.utoronto.ca
Abstract:This paper presents a radar odometry method that combines probabilistic trajectory estimation and deep-learned features without needing ground truth pose information. The feature network is trained unsupervised, using only the on-board radar data. Grounded in a data likelihood objective, our method leverages a deep network to process the rich radar data and a non-differentiable classic estimator for probabilistic inference. We provide extensive experimental results on both the publicly available Oxford Radar RobotCar Dataset and an additional 100 km of driving collected in an urban setting. Our sliding-window implementation of radar odometry outperforms most hand-crafted methods and approaches the current state of the art without requiring a ground truth trajectory for training. We also demonstrate the effectiveness of radar odometry under adverse weather conditions. Code for this project can be found at: https://github.com/utiasASRL/hero_radar_odometry
Abstract:To tackle the challenge of unfavorable weather conditions such as rain and snow, radar is being revisited as a sensing modality parallel to vision and lidar. Recent works have made tremendous progress in applying radar to odometry and place recognition. However, these works have so far ignored the impact of motion distortion and Doppler effects on radar-based navigation, which may be significant in the self-driving car domain where speeds can be high. In this work, we demonstrate the effect of these distortions on radar-only odometry using the Oxford Radar RobotCar Dataset and on metric localization using our own data-collection platform. We present a lightweight estimator that can recover the motion between a pair of radar scans while accounting for both effects. Our conclusion is that both motion distortion and the Doppler effect are significant in different aspects of radar navigation, with the former more prominent than the latter.
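For intuition on the first effect, here is a minimal 2D sketch of removing motion distortion from a spinning-radar scan under a constant-velocity assumption; the estimator in the paper is its own and also accounts for the Doppler-induced range shift, so treat the function below and its first-order model as illustrative assumptions.

import numpy as np

def undistort_scan_2d(points, times, t_ref, v, omega):
    # points: (N, 2) points in the sensor frame at their individual azimuth
    # timestamps; times: (N,) timestamps; t_ref: common reference time for
    # the scan; v: (2,) translational velocity, omega: yaw rate, both assumed
    # constant over the sweep.
    out = np.empty_like(points, dtype=float)
    for i, (p, t) in enumerate(zip(points, times)):
        dt = t - t_ref
        th = omega * dt
        C = np.array([[np.cos(th), -np.sin(th)],
                      [np.sin(th),  np.cos(th)]])
        # First-order constant-velocity model: express each return in the
        # sensor frame at t_ref by applying the pose accrued over dt.
        out[i] = C @ p + v * dt
    return out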