
Matteo Palieri

SEEK: Semantic Reasoning for Object Goal Navigation in Real World Inspection Tasks

May 16, 2024

Present and Future of SLAM in Extreme Underground Environments

Aug 02, 2022

LAMP 2.0: A Robust Multi-Robot SLAM System for Operation in Challenging Large-Scale Underground Environments

May 31, 2022

LOCUS 2.0: Robust and Computationally Efficient Lidar Odometry for Real-Time Underground 3D Mapping

May 24, 2022

NeBula: Quest for Robotic Autonomy in Challenging Environments; TEAM CoSTAR at the DARPA Subterranean Challenge

Mar 28, 2021

DARE-SLAM: Degeneracy-Aware and Resilient Loop Closing in Perceptually-Degraded Environments

Feb 09, 2021

Autonomous Spot: Long-Range Autonomous Exploration of Extreme Environments with Legged Locomotion

Nov 01, 2020

LAMP: Large-Scale Autonomous Mapping and Positioning for Exploration of Perceptually-Degraded Subterranean Environments

Mar 05, 2020