David Wisth

Hilti-Oxford Dataset: A Millimetre-Accurate Benchmark for Simultaneous Localization and Mapping

Aug 21, 2022

Team CERBERUS Wins the DARPA Subterranean Challenge: Technical Overview and Lessons Learned

Jul 11, 2022

CERBERUS: Autonomous Legged and Aerial Robotic Exploration in the Tunnel and Urban Circuits of the DARPA Subterranean Challenge

Jan 18, 2022

Balancing the Budget: Feature Selection and Tracking for Multi-Camera Visual-Inertial Odometry

Sep 13, 2021

VILENS: Visual, Inertial, Lidar, and Leg Odometry for All-Terrain Legged Robots

Jul 15, 2021

Unified Multi-Modal Landmark Tracking for Tightly Coupled Lidar-Visual-Inertial Odometry

Nov 13, 2020

The Newer College Dataset: Handheld LiDAR, Inertial and Vision with Ground Truth

Mar 12, 2020

Preintegrated Velocity Bias Estimation to Overcome Contact Nonlinearities in Legged Robot Odometry

Oct 22, 2019

Robust Legged Robot State Estimation Using Factor Graph Optimization

Apr 05, 2019