Abstract: Foundation models, e.g., large language models, possess attributes of intelligence which offer promise to endow a robot with the contextual understanding necessary to navigate complex, unstructured tasks in the wild. In the future of space robotics, we see three core challenges which motivate the use of a foundation model adapted to space-based applications: 1) Scalability of ground-in-the-loop operations; 2) Generalizing prior knowledge to novel environments; and 3) Multi-modality in tasks and sensor data. Therefore, as a first step towards building a foundation model for space-based applications, we automatically label the AI4Mars dataset to curate a language-annotated dataset of visual question-answer tuples. We fine-tune a pretrained LLaVA checkpoint on this dataset to endow a vision-language model with the ability to perform spatial reasoning and navigation on Mars' surface. In this work, we demonstrate that 1) existing vision-language models are deficient visual reasoners in space-based applications, and 2) fine-tuning a vision-language model on extraterrestrial data significantly improves the quality of responses even with a limited training dataset of only a few thousand samples.
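As a rough illustration of the data-curation step described in this abstract, the sketch below converts an AI4Mars-style per-pixel terrain mask into a single visual question-answer record in the conversation JSON format commonly used for LLaVA fine-tuning. The class mapping, question/answer templates, and traversability heuristic are illustrative assumptions, not the paper's actual labeling pipeline.

```python
import json
import numpy as np

# Illustrative AI4Mars-style terrain classes (assumed mapping; the real dataset
# encodes soil/bedrock/sand/big-rock masks as per-pixel integer labels).
TERRAIN_CLASSES = {0: "soil", 1: "bedrock", 2: "sand", 3: "big rock"}
NULL_LABEL = 255  # unlabeled pixels

def mask_to_vqa(image_id: str, mask: np.ndarray) -> dict:
    """Turn one segmentation mask into a LLaVA-style conversation record."""
    labeled = mask[mask != NULL_LABEL]
    counts = {name: int((labeled == cls).sum()) for cls, name in TERRAIN_CLASSES.items()}
    dominant = max(counts, key=counts.get)
    # Toy heuristic: treat scenes dominated by soil/bedrock as traversable.
    traversable = counts["soil"] + counts["bedrock"] > counts["sand"] + counts["big rock"]

    question = "What terrain types are visible, and is the area ahead safe to traverse?"
    answer = (
        f"The scene is dominated by {dominant}. "
        + ("The terrain ahead looks traversable."
           if traversable
           else "Loose sand or large rocks make the terrain ahead hazardous.")
    )
    return {
        "id": image_id,
        "image": f"{image_id}.jpg",
        "conversations": [
            {"from": "human", "value": "<image>\n" + question},
            {"from": "gpt", "value": answer},
        ],
    }

# Example: write a tiny training file in the format expected for fine-tuning.
records = [mask_to_vqa("NLB_0001", np.random.randint(0, 4, size=(64, 64)))]
with open("ai4mars_vqa_train.json", "w") as f:
    json.dump(records, f, indent=2)
```

A file of such records could then be passed to a standard LLaVA fine-tuning script; the exact prompts and label-to-language rules used to build the few-thousand-sample dataset are specific to the paper.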
Abstract: This paper presents initial flight results for distributed optical angles-only navigation of a swarm of small spacecraft, conducted during the Starling Formation-Flying Optical Experiment (StarFOX). StarFOX is a core payload of the NASA Starling mission, which consists of four CubeSats launched in 2023. Prior angles-only flight demonstrations have featured only one observer and target and have relied upon a priori target orbit knowledge for initialization, translational maneuvers to resolve target range, and external absolute orbit updates to maintain convergence. StarFOX overcomes these limitations by applying the angles-only Absolute and Relative Trajectory Measurement System (ARTMS), which integrates three novel algorithms. Image Processing detects and tracks multiple targets in images from each satellite's on-board camera. Batch Orbit Determination computes initial swarm orbit estimates from batches of bearing angles. Sequential Orbit Determination leverages an unscented Kalman filter to refine swarm state estimates over time. Multi-observer measurements shared over an intersatellite link are seamlessly fused to enable absolute and relative orbit determination. StarFOX flight data presents the first demonstrations of autonomous angles-only navigation for a satellite swarm, including multi-target and multi-observer relative navigation; autonomous initialization of navigation for unknown targets; and simultaneous absolute and relative orbit determination. Relative positioning uncertainties of 1.3% of target range (1$\sigma$) are achieved for a single observer under challenging measurement conditions, reduced to 0.6% (1$\sigma$) with multiple observers. Results demonstrate promising performance with regard to ongoing StarFOX campaigns and the application of angles-only navigation to future distributed missions.
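As a much-simplified illustration of the Sequential Orbit Determination step, the sketch below runs an unscented Kalman filter (here via the filterpy library, not the flight software) over azimuth/elevation bearings to a single target, using closed-form Clohessy-Wiltshire relative dynamics as a stand-in for the absolute-plus-relative orbit model that ARTMS actually estimates. The mean motion, noise levels, initial state, and the helper functions fx/hx are assumed values and names chosen only to make the example run.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

N = 0.0011  # chief mean motion [rad/s], roughly low Earth orbit (assumed)

def fx(x, dt):
    """Clohessy-Wiltshire state transition for x = [rx, ry, rz, vx, vy, vz]."""
    s, c = np.sin(N * dt), np.cos(N * dt)
    return np.array([
        [4 - 3*c,       0, 0,    s/N,          2*(1 - c)/N,      0],
        [6*(s - N*dt),  1, 0,   -2*(1 - c)/N, (4*s - 3*N*dt)/N,  0],
        [0,             0, c,    0,            0,                s/N],
        [3*N*s,         0, 0,    c,            2*s,              0],
        [-6*N*(1 - c),  0, 0,   -2*s,          4*c - 3,          0],
        [0,             0, -N*s, 0,            0,                c],
    ]) @ x

def hx(x):
    """Bearing angles (azimuth, elevation) from observer to target."""
    rx, ry, rz = x[:3]
    return np.array([np.arctan2(ry, rx), np.arctan2(rz, np.hypot(rx, ry))])

points = MerweScaledSigmaPoints(n=6, alpha=1.0, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=2, dt=10.0, fx=fx, hx=hx, points=points)
ukf.x = np.array([1000.0, 200.0, 50.0, 0.0, -2.2, 0.0])   # coarse guess (e.g., from Batch OD)
ukf.P = np.diag([300.0**2] * 3 + [1.0**2] * 3)
ukf.R = np.diag([np.deg2rad(0.02)**2] * 2)                 # camera bearing noise (assumed)
ukf.Q = np.eye(6) * 1e-6                                   # process noise (assumed)

# Synthetic measurement pass standing in for the Image Processing track output.
truth = np.array([1200.0, 0.0, 100.0, 0.0, -2.6, 0.0])
rng = np.random.default_rng(0)
for _ in range(60):
    truth = fx(truth, 10.0)
    z = hx(truth) + rng.normal(0.0, np.deg2rad(0.02), 2)
    ukf.predict()
    ukf.update(z)

print("Estimated relative position [m]:", ukf.x[:3])
```

In flight, bearings from multiple observers shared over the intersatellite link would be fused in the same sequential update, which is what the abstract credits for reducing relative positioning uncertainty from 1.3% to 0.6% of target range; here a single synthetic measurement stream stands in for those tracks.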