Abstract: Accurate localization is a core component of a robot's navigation system. To this end, global navigation satellite systems (GNSS) can provide absolute measurements outdoors and, therefore, eliminate long-term drift. However, fusing GNSS data with other sensor data is not trivial, especially when a robot moves between areas with and without sky view. We propose a robust approach that tightly fuses raw GNSS receiver data with inertial measurements and, optionally, lidar observations for precise and smooth mobile robot localization. A factor graph with two types of GNSS factors is proposed: first, factors based on pseudoranges, which allow for global localization on Earth; second, factors based on carrier phases, which enable highly accurate relative localization and are useful when other sensing modalities are challenged. Unlike traditional differential GNSS, this approach does not require a connection to a base station. On a public urban driving dataset, our approach achieves accuracy comparable to a state-of-the-art algorithm that fuses visual-inertial odometry with GNSS data -- despite our approach using only inertial and GNSS data, without the camera. We also demonstrate the robustness of our approach using data from a car and a quadruped robot moving in environments with little sky visibility, such as a forest. The accuracy in the global Earth frame remains 1-2 m, while the estimated trajectories are discontinuity-free and smooth. We also show how lidar measurements can be tightly integrated. We believe this is the first system that fuses raw GNSS observations (as opposed to fixes) with lidar.