Abstract: (Abridged) Quantitative three-dimensional (3D) position and velocity estimates obtained by passive radar will assist the Galileo Project in the detection and classification of aerial objects by providing critical measurements of range, location, and kinematics. These parameters will be combined with those derived from the Project's suite of electromagnetic sensors and used to separate known aerial objects from those exhibiting anomalous kinematics. SkyWatch, a passive multistatic radar system based on commercial broadcast FM radio transmitters of opportunity, is a network of receivers spaced at geographical scales that enables estimation of the 3D position and velocity time series of objects at altitudes up to 80 km, horizontal distances up to 150 km, and velocities up to ±2 km/s (±Mach 6). The receivers are designed to collect useful data in a variety of environments that vary by terrain, transmitter power, relative transmitter distance, adjacent-channel strength, etc. In some cases, the direct signal from the transmitter may be strong enough to serve as the reference with which the echoes are correlated. In other cases, the direct signal may be weak or absent, in which case a reference is communicated to the receiver from another network node via the internet for echo correlation. Techniques specific to these two modes of operation, as well as a hybrid mode, are discussed. Delay and Doppler data are sent via the internet to a central server, where triangulation is used to deduce time series of 3D positions and velocities. A multiple-receiver (multistatic) radar experiment is undergoing Phase 1 testing, with several receivers placed at various distances around the Harvard–Smithsonian Center for Astrophysics (CfA), to validate full 3D position and velocity recovery.
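The echo correlation described above can be sketched as a cross-ambiguity function evaluated on a grid of delay and Doppler hypotheses. The following is a minimal brute-force sketch, not the SkyWatch implementation; the function name and parameters are illustrative, and a real system would use FFT-based batching for speed:

```python
import numpy as np

def cross_ambiguity(ref, surv, fs, max_delay, doppler_bins):
    """Magnitude of the cross-ambiguity function between a reference
    channel `ref` and a surveillance channel `surv`, both complex
    baseband sampled at `fs` Hz, over sample delays 0..max_delay-1
    and the Doppler shifts (Hz) listed in `doppler_bins`.
    """
    n = len(ref)
    t = np.arange(n) / fs
    caf = np.zeros((len(doppler_bins), max_delay))
    for i, fd in enumerate(doppler_bins):
        # Remove the hypothesized Doppler shift from the surveillance channel.
        derotated = surv * np.exp(-2j * np.pi * fd * t)
        for d in range(max_delay):
            # Correlate against the reference at this hypothesized delay.
            caf[i, d] = np.abs(np.vdot(ref[: n - d], derotated[d:]))
    return caf
```

A peak at Doppler bin fd and delay bin d corresponds to a bistatic Doppler shift of fd Hz and an excess path delay of d/fs seconds for that echo; delay and Doppler estimates from several receivers are what the central server triangulates into 3D positions and velocities.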
Abstract: (Abridged) Unidentified Aerial Phenomena (UAP) have resisted explanation and have received little formal scientific attention for 75 years. A primary objective of the Galileo Project is to build an integrated software and instrumentation system designed to conduct a multimodal census of aerial phenomena and to recognize anomalies. Here we present key motivations for the study of UAP and address historical objections to this research. We describe an approach for highlighting outlier events in the high-dimensional parameter space of our census measurements. We provide a detailed roadmap for deciding measurement requirements, as well as a science traceability matrix (STM) for connecting sought-after physical parameters to observables and instrument requirements. We also discuss potential strategies for deciding where to locate instruments for development, testing, and final deployment. Our instrument package is multimodal and multispectral, consisting of (1) wide-field cameras in multiple bands for targeting and tracking of aerial objects and deriving their positions and kinematics using triangulation; (2) narrow-field instruments including cameras for characterizing morphology, spectra, polarimetry, and photometry; (3) passive multistatic arrays of antennas and receivers for radar-derived range and kinematics; (4) radio spectrum analyzers to measure radio and microwave emissions; (5) microphones for sampling acoustic emissions in the infrasonic through ultrasonic frequency bands; and (6) environmental sensors for characterizing ambient conditions (temperature, pressure, humidity, and wind velocity), as well as quasistatic electric and magnetic fields, and energetic particles. The use of multispectral instruments and multiple sensor modalities will help to ensure that artifacts are recognized and that true detections are corroborated and verifiable.
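One simple way to flag outlier events in a high-dimensional parameter space, sketched here purely for illustration and not as the Project's actual anomaly-recognition method, is to rank events by Mahalanobis distance from the bulk of the census measurements:

```python
import numpy as np

def mahalanobis_scores(X):
    """Distance of each row of X (events x measured parameters) from the
    sample mean, in units that account for correlations between parameters.

    Events far from the bulk in this metric are candidate outliers. A real
    census pipeline would use robust estimates of the mean and covariance
    so that the outliers themselves do not mask the statistics.
    """
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    cov_inv = np.linalg.pinv(cov)  # pseudo-inverse tolerates degenerate parameters
    diff = X - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
```

Because the metric whitens correlated parameters, an event can score as anomalous even when no single measurement is individually extreme, which is the point of combining many sensor modalities in one census.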
Abstract: To date, there are few reliable data on the position, velocity, and acceleration characteristics of Unidentified Aerial Phenomena (UAP). The dual hardware and software system described in this document provides a means to address this gap. We describe a weatherized multi-camera system that can capture images at visible, infrared, and near-infrared wavelengths. We then describe the software we will use to calibrate the cameras and to robustly localize objects of interest in three dimensions. We show how object localizations captured over time will be used to compute the velocity and acceleration of airborne objects.
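The final step, recovering velocity and acceleration from a time series of 3D localizations, can be sketched with central finite differences. The function name and sampling interval below are illustrative, not the paper's implementation:

```python
import numpy as np

def kinematics_from_positions(pos, dt):
    """Estimate velocity and acceleration from uniformly sampled 3D positions.

    pos : array of shape (n_samples, 3), triangulated positions in meters
    dt  : sampling interval in seconds

    Uses second-order central differences in the interior (np.gradient);
    estimates at the first and last samples are one-sided and less accurate.
    """
    vel = np.gradient(pos, dt, axis=0)  # m/s
    acc = np.gradient(vel, dt, axis=0)  # m/s^2
    return vel, acc
```

In practice noisy localizations would be smoothed or fed to a tracking filter before differencing, since differentiation amplifies measurement noise.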