Abstract: Autonomous drone racing requires powerful perception, planning, and control and has become a benchmark and testbed for autonomous, agile flight. Existing work usually assumes static race tracks with known maps, which enables offline planning of time-optimal trajectories, localization relative to the gates to reduce drift in visual-inertial odometry (VIO) for state estimation, or training of learning-based methods for the particular race track and operating environment. In contrast, many real-world tasks such as disaster response or delivery must be performed in unknown and dynamic environments. To close this gap and make drone racing more robust to unseen environments and moving gates, we propose a control algorithm that requires neither a race-track map nor VIO and uses only monocular measurements of the line of sight (LOS) to the gates. To this end, we adopt the law of proportional navigation (PN) to fly accurately through the gates despite gate motion or wind. We formulate the PN-informed, vision-based control problem for drone racing as a constrained optimization problem and derive a closed-form optimal solution. Extensive simulations and real-world experiments demonstrate that our method navigates through moving gates at high speed while remaining robust to different gate movements, model errors, wind, and delays.
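For context, the classical (textbook) three-dimensional true proportional-navigation law, which the abstract's PN-informed controller builds on but does not necessarily reproduce verbatim, commands an acceleration proportional to the rotation rate of the LOS to the target; the symbols below are the standard ones and are not taken from this paper:
\[
\mathbf{a}_{\mathrm{cmd}} \;=\; N \,\boldsymbol{\Omega} \times \mathbf{v}_{\mathrm{rel}},
\qquad
\boldsymbol{\Omega} \;=\; \frac{\mathbf{r} \times \mathbf{v}_{\mathrm{rel}}}{\mathbf{r}\cdot\mathbf{r}},
\]
where $\mathbf{r}$ is the LOS vector from the vehicle to the gate, $\mathbf{v}_{\mathrm{rel}}$ is their relative velocity, $\boldsymbol{\Omega}$ is the LOS rotation rate, and $N$ is the navigation gain (typically 3 to 5). Driving the LOS rate to zero in this way places the vehicle on a collision (here, gate-crossing) course even when the gate is moving.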