Abstract:This paper introduces a scenario where a maneuverable satellite in geostationary orbit (GEO) conducts on-orbit attacks, targeting communication between a GEO satellite and a ground station, with the ability to switch between stationary and time-variant jamming modes. We propose a machine learning-based detection approach, employing the random forest algorithm with principal component analysis (PCA) to enhance detection accuracy in the stationary model. At the same time, an adaptive threshold-based technique is implemented for the time-variant model to detect dynamic jamming events effectively. Our methodology emphasizes integrating physical constraints from orbital dynamics to improve model robustness and detection accuracy. Simulation results highlight the effectiveness of PCA in enhancing the performance of the stationary model, while the adaptive thresholding method achieves high accuracy in detecting jamming in the time-variant scenario. This approach provides a robust solution for mitigating evolving threats to satellite communication in GEO environments.
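As a hedged illustration of the stationary-mode detector, the sketch below chains PCA with a random forest classifier on synthetic link features; the feature set, the synthetic data, and all hyperparameters are placeholders, not the paper's actual pipeline.

```python
# Minimal sketch: PCA + random forest for stationary jamming detection.
# Features and data are synthetic stand-ins for link telemetry.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical link-telemetry features (e.g., received power, SNR, BER proxies).
n_samples, n_features = 2000, 12
X_nominal = rng.normal(0.0, 1.0, size=(n_samples // 2, n_features))
X_jammed = rng.normal(0.8, 1.3, size=(n_samples // 2, n_features))  # shifted statistics under jamming
X = np.vstack([X_nominal, X_jammed])
y = np.r_[np.zeros(n_samples // 2), np.ones(n_samples // 2)]        # 0 = nominal, 1 = jammed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PCA compresses the correlated link features before the random forest classifies.
detector = make_pipeline(PCA(n_components=5),
                         RandomForestClassifier(n_estimators=200, random_state=0))
detector.fit(X_tr, y_tr)
print(f"held-out detection accuracy: {detector.score(X_te, y_te):.3f}")
```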
Abstract:The rapid evolution of communication technologies, compounded by recent geopolitical events such as the Viasat cyberattack in February 2022, has highlighted the urgent need for fast and reliable satellite missions for military and civil security operations. Consequently, this paper examines two Earth observation (EO) missions: one utilizing a single low Earth orbit (LEO) satellite and the other a network of LEO satellites, employing a secure-by-component design strategy. This approach begins by defining the scope of technical security engineering, decomposing the system into components and data flows, and enumerating attack surfaces. It then proceeds by identifying threats to low-level components, applying secure-by-design principles, redesigning components into secure blocks in alignment with the Space Attack Research & Tactic Analysis (SPARTA) framework, and crafting "shall" statements to refactor the system design, with a particular focus on improving the security of the link segment.
Abstract:The Moon and its surrounding cislunar space present numerous unknowns, uncertainties, and partially charted phenomena that must be investigated to determine the extent to which they affect cislunar communication. These include temperature fluctuations, spacecraft distance and velocity dynamics, surface roughness, and the diversity of propagation mechanisms. To develop robust and dynamically operative cislunar space networks (CSNs), we need to analyze the communication system by incorporating inclusive models that account for the wide range of possible propagation environments and noise characteristics. In this paper, we consider that the communication signal can be subjected not only to Gaussian and non-Gaussian noise but also to different fading conditions. First, we analyze the communication link by showing the relationship between the brightness temperatures of the Moon and the equivalent noise temperature at the receiver of the Lunar Gateway. We propose to analyze the ergodic capacity and the outage probability, as they are essential metrics for the development of reliable communication. In particular, we model the noise with the additive symmetric alpha-stable distribution, which allows a generic analysis covering both Gaussian and non-Gaussian signal characteristics. Then, we present closed-form bounds for the ergodic capacity and the outage probability. Finally, the results show the theoretically and operationally achievable performance bounds for cislunar communication. To give insight into further designs, we also provide results for comprehensive system settings that include mission objectives as well as orbital and system dynamics.
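For reference only, the generic definitions underlying these metrics can be written as below; the symbols ($\sigma$, $\delta$, $\gamma$ as an effective instantaneous SNR) are generic, and the paper's specific closed-form bounds are not reproduced here.

```latex
% Generic definitions; not the paper's derived bounds.
\begin{align}
  \varphi_N(t) &= \exp\!\left(j\delta t - \sigma^{\alpha}\,|t|^{\alpha}\right)
    && \text{(symmetric $\alpha$-stable noise; $\alpha=2$ recovers the Gaussian case)}\\
  \bar{C} &= \mathbb{E}_{\gamma}\!\left[\log_2\!\left(1+\gamma\right)\right]
    && \text{(ergodic capacity, (bit/s)/Hz)}\\
  P_{\mathrm{out}}(\gamma_{\mathrm{th}}) &= \Pr\!\left(\gamma < \gamma_{\mathrm{th}}\right) = F_{\gamma}(\gamma_{\mathrm{th}})
    && \text{(outage probability at threshold $\gamma_{\mathrm{th}}$)}
\end{align}
```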
Abstract:The rise in low Earth orbit (LEO) satellite Internet services has led to increasing demand, often exceeding available data rates and compromising the quality of service. While deploying more satellites offers a short-term fix, designing higher-performance satellites with enhanced transmission capabilities provides a more sustainable solution. Achieving the necessary high capacity requires interconnecting multiple modem banks within a satellite payload. However, there is a notable gap in research on internal packet routing within extremely high-throughput satellites. To address this, we propose a real-time optimal flow allocation and priority queue scheduling method using online convex optimization-based model predictive control. We model the problem as a multi-commodity flow instance and employ an online interior-point method to solve the routing and scheduling optimization iteratively. This approach minimizes packet loss and supports real-time rerouting with low computational overhead. Our method is tested in simulation on a next-generation extremely high-throughput satellite model, demonstrating its effectiveness compared to a reference batch optimization and to traditional methods.
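As a minimal sketch of the multi-commodity flow view (not the paper's online interior-point MPC implementation), the toy convex program below allocates two traffic commodities over a hypothetical three-edge payload graph and minimizes dropped traffic; the topology, capacities, and demands are illustrative assumptions. It requires the cvxpy package.

```python
# Toy multi-commodity flow allocation inside a payload, posed as a convex program.
# The online method would re-solve a problem like this every control step.
import cvxpy as cp
import numpy as np

# Hypothetical payload graph: two input modem banks (0, 1) feed a switch (2)
# that feeds one output bank (3). Capacities and demands are placeholders.
edges = [(0, 2), (1, 2), (2, 3)]
cap = np.array([10.0, 10.0, 15.0])   # per-edge capacity (packets/ms)
demands = {0: 9.0, 1: 8.0}           # commodity k: source node -> demand toward the sink
sink, n_nodes = 3, 4

f = cp.Variable((len(demands), len(edges)), nonneg=True)  # per-commodity edge flows
served = cp.Variable(len(demands), nonneg=True)           # admitted traffic per commodity

constraints = [cp.sum(f, axis=0) <= cap]                  # shared link capacities
for k, (src, dem) in enumerate(demands.items()):
    constraints.append(served[k] <= dem)
    for v in range(n_nodes):                              # flow conservation at every node
        inflow = sum(f[k, e] for e, (i, j) in enumerate(edges) if j == v)
        outflow = sum(f[k, e] for e, (i, j) in enumerate(edges) if i == v)
        if v == src:
            constraints.append(outflow - inflow == served[k])
        elif v == sink:
            constraints.append(inflow - outflow == served[k])
        else:
            constraints.append(inflow - outflow == 0)

# Minimize dropped traffic across both commodities.
prob = cp.Problem(cp.Minimize(sum(demands.values()) - cp.sum(served)), constraints)
prob.solve()
print("served per commodity:", np.round(served.value, 2))
```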
Abstract:This paper addresses the limitations of current satellite payload architectures, which are predominantly hardware-driven and lack the flexibility to adapt to increasing data demands and uneven traffic. To overcome these challenges, we present a novel architecture for future regenerative and programmable satellite payloads that utilizes interconnected modem banks to promote greater scalability and flexibility. We formulate an optimization problem to efficiently manage traffic among these modem banks and balance the load. Additionally, we provide comparative numerical simulation results covering end-to-end delay and packet loss. The results illustrate that our proposed architecture maintains lower delays and packet loss even with higher traffic demands and smaller buffer sizes.
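A hedged, much-simplified illustration of the load-balancing idea follows: the small convex program splits aggregate traffic across interconnected modem banks to minimize the peak bank utilization. The capacities, the demand figure, and the min-max objective are assumptions; the paper's formulation additionally accounts for delay and packet loss.

```python
# Minimal load-balancing sketch across modem banks (min-max utilization).
import cvxpy as cp
import numpy as np

capacity = np.array([500.0, 400.0, 300.0])    # per-bank throughput (Mbps), placeholder
demand = 900.0                                 # aggregate offered traffic (Mbps), placeholder

x = cp.Variable(len(capacity), nonneg=True)    # traffic routed to each modem bank
utilization = cp.multiply(x, 1.0 / capacity)   # per-bank load factor

prob = cp.Problem(cp.Minimize(cp.max(utilization)),   # balance the load across banks
                  [cp.sum(x) == demand, x <= capacity])
prob.solve()
print("traffic split (Mbps):", np.round(x.value, 1))
print("peak bank utilization:", round(float(prob.value), 3))
```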
Abstract:Machine unlearning (MUL) is introduced as a means to achieve interference cancellation within artificial intelligence (AI)-enabled wireless systems. It is observed that interference cancellation with MUL yields a $30\%$ improvement in classification accuracy in the presence of a corrupted AI model. Accordingly, the need for instantaneous channel state information for the existing interference source is eliminated, and a latent space corrupted with interference noise is cleansed with the MUL algorithm, without either retraining or dataset cleansing. A Membership Interference Attack (MIA) serves as a benchmark for assessing the efficacy of MUL in mitigating interference within a neural network model. The advantage of the MUL algorithm is determined by evaluating both the probability of interference and the number of samples requiring retraining. In a simple signal-to-noise ratio classification task, the accuracy improvement across various test cases demonstrates both the capabilities and the limitations of MUL, particularly in native AI applications.
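The abstract does not specify the MUL algorithm itself; purely as a hedged illustration of the unlearning idea, the sketch below trains a simple classifier on data that includes interference-corrupted samples and then applies a few gradient-ascent steps on only those samples to reduce their influence, with no retraining on the clean data. This is a generic baseline, not the paper's method.

```python
# Generic gradient-ascent unlearning sketch (NOT the paper's MUL algorithm).
import numpy as np

rng = np.random.default_rng(0)
dim, n_clean, n_bad = 10, 500, 100

# Synthetic classification data plus interference-corrupted samples.
w_true = rng.normal(size=dim)
X_clean = rng.normal(size=(n_clean, dim))
y_clean = (X_clean @ w_true > 0).astype(float)
X_bad = rng.normal(size=(n_bad, dim)) + 2.0          # interference shifts the features
y_bad = rng.integers(0, 2, n_bad).astype(float)      # and decorrelates the labels
X, y = np.vstack([X_clean, X_bad]), np.concatenate([y_clean, y_bad])

def grad(w, X, y):
    """Gradient of the mean logistic loss."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

def nll(w, X, y):
    """Mean logistic negative log-likelihood (the loss being un-learned)."""
    z = X @ w
    return np.mean(y * np.log1p(np.exp(-z)) + (1 - y) * np.log1p(np.exp(z)))

# 1) Train once on the full dataset, corrupted samples included.
w = np.zeros(dim)
for _ in range(500):
    w -= 0.5 * grad(w, X, y)
print("loss on corrupted subset before unlearning:", round(float(nll(w, X_bad, y_bad)), 3))

# 2) Unlearn: a few gradient-ascent steps on the corrupted subset only.
#    The step count trades forgetting against utility on the retained data.
for _ in range(10):
    w += 0.05 * grad(w, X_bad, y_bad)
print("loss on corrupted subset after unlearning: ", round(float(nll(w, X_bad, y_bad)), 3))
```

The rising loss on the forgotten subset is the kind of signal a membership-style attack would probe when verifying that unlearning took effect.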
Abstract:We address the challenge of developing an orthogonal time-frequency space (OTFS)-based non-orthogonal multiple access (NOMA) system where each user is modulated using orthogonal pulses in the delay-Doppler domain. Building upon the concept of the sufficient (bi)orthogonality train-pulse [1], we extend this idea by introducing Hermite functions, known for their orthogonality properties. Simulation results demonstrate that our proposed Hermite functions outperform traditional OTFS-NOMA schemes, including power-domain (PDM) NOMA and code-domain (CDM) NOMA, in terms of bit error rate (BER) over a high-mobility channel. The algorithm's complexity is minimal, primarily involving the demodulation of OTFS. The spectral efficiency of Hermite-based OTFS-NOMA is K times that of the OTFS-CDM-NOMA scheme, where K is the spreading length of the NOMA waveform.
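To make the orthogonality property concrete, the short script below builds the first few orthonormal Hermite functions and checks numerically that their Gram matrix is close to the identity; it illustrates only the pulse family, not the paper's OTFS-NOMA transceiver, and the value of K here is arbitrary.

```python
# Numerical orthogonality check for Hermite functions.
import numpy as np
from math import factorial, pi, sqrt
from scipy.special import eval_hermite

def hermite_function(n, t):
    """Orthonormal Hermite function psi_n(t) = H_n(t) e^{-t^2/2} / sqrt(2^n n! sqrt(pi))."""
    return eval_hermite(n, t) * np.exp(-t**2 / 2) / sqrt(2.0**n * factorial(n) * sqrt(pi))

t = np.linspace(-10, 10, 4001)
K = 4                                   # number of superposed users (illustrative)
pulses = np.array([hermite_function(n, t) for n in range(K)])

# Gram matrix of pairwise inner products; should be close to the identity.
gram = np.trapz(pulses[:, None, :] * pulses[None, :, :], t, axis=-1)
print(np.round(gram, 3))
```

Because K mutually orthogonal pulses can carry K users' symbols over the same delay-Doppler resources, the K-fold spectral-efficiency factor quoted in the abstract follows directly from this property.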
Abstract:Interplanetary links (IPL) serve as crucial enablers for space exploration, facilitating secure and adaptable space missions. An integrated IPL with inter-satellite communication (IP-ISL) establishes a unified deep space network, expanding coverage and reducing atmospheric losses. Challenges including irregularities in charged particle density, hardware impairments, and hidden celestial body brightness are analyzed for a reflectarray-based IP-ISL between Earth and Moon orbiters. It is observed that severe hardware impairments on the order of $10^{-8}$, combined with intense solar plasma density, drop an ideal system's spectral efficiency (SE) from $\sim\!38~\textrm{(bit/s)/Hz}$ down to $0~\textrm{(bit/s)/Hz}$. An ideal full angle-of-arrival fluctuation recovery with full steering range achieves a $\sim\!20~\textrm{(bit/s)/Hz}$ gain, and a limited beamsteering with a numerical reflectarray design achieves at least a $\sim\!1~\textrm{(bit/s)/Hz}$ gain in severe hardware impairment cases.
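For context only, the snippet below evaluates the commonly used aggregate hardware-impairment model SE = log2(1 + SNR / (kappa^2 * SNR + 1)), which shows how residual distortion caps spectral efficiency at high SNR. The kappa values are placeholders, and this is not the paper's impairment or solar-plasma model.

```python
# Generic distortion-limited spectral efficiency curve (illustrative model only).
import numpy as np

def spectral_efficiency(snr_db, kappa):
    """SE in (bit/s)/Hz under an aggregate residual-impairment level kappa."""
    snr = 10.0 ** (np.asarray(snr_db) / 10.0)
    return np.log2(1.0 + snr / (kappa**2 * snr + 1.0))

snr_db = np.arange(0, 61, 10)
for kappa in (0.0, 0.05, 0.15):        # 0.0 = ideal hardware; others are placeholders
    print(f"kappa={kappa:>4}:", np.round(spectral_efficiency(snr_db, kappa), 2))
```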
Abstract:Distributed massive multiple-input multiple-output (mMIMO) systems for low Earth orbit (LEO) satellite networks are introduced as a promising technique to provide broadband connectivity. Nevertheless, several challenges persist in implementing distributed mMIMO systems for LEO satellite networks. These include providing a scalable massive access implementation, as system complexity increases with network size. Another challenging issue is the asynchronous arrival of signals at the user terminals due to the different propagation delays among distributed antennas in space, which destroys coherent transmission and consequently degrades system performance. In this paper, we propose a scalable distributed mMIMO system for LEO satellite networks based on dynamic user-centric clustering. Aiming for a scalable implementation, new algorithms for initial cooperative access, cluster selection, and cluster handover are provided. In addition, phase shift-aware precoding is implemented to compensate for the propagation delay phase shifts. The performance of the proposed user-centric distributed mMIMO system is compared with two baseline configurations: non-cooperative transmission, where each user connects to only a single satellite, and full-cooperative distributed mMIMO, where all satellites contribute to serving each user. The numerical results show the potential of the proposed distributed mMIMO system to enhance spectral efficiency compared to non-cooperative transmission. Additionally, it minimizes the serving cluster size for each user, thereby reducing overall system complexity in comparison to the full-cooperative distributed mMIMO system.
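A minimal numerical illustration of the phase shift-aware precoding idea is sketched below: the delay-induced carrier-phase rotations are folded into the effective channel, and a precoder matched to that effective channel recovers coherent combining at the user. The carrier frequency, slant ranges, and narrowband single-user model are illustrative assumptions, not the paper's system model.

```python
# Coherent combining with and without compensating delay-induced phase shifts.
import numpy as np

rng = np.random.default_rng(1)
fc = 2.0e9                                        # carrier frequency (Hz), placeholder
c = 3.0e8
n_sat = 4                                         # serving cluster size (placeholder)
ranges_m = 800e3 + rng.uniform(0, 200e3, n_sat)   # slant ranges of the cluster
tau = ranges_m / c                                # propagation delays
h = (rng.normal(size=n_sat) + 1j * rng.normal(size=n_sat)) / np.sqrt(2)  # small-scale fading

# Effective downlink channel including the delay-induced carrier-phase rotation.
g = h * np.exp(-1j * 2 * np.pi * fc * tau)

w_naive = np.conj(h) / np.linalg.norm(h)          # ignores the delay phase shifts
w_aware = np.conj(g) / np.linalg.norm(g)          # phase shift-aware (delay-compensated)

for name, w in (("naive", w_naive), ("delay-aware", w_aware)):
    gain = np.abs(np.sum(g * w)) ** 2             # combining gain |g^T w|^2
    print(f"{name:>11s} combining gain: {gain:.2f}")
```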
Abstract:Precise incident wave angle estimation in aerial communication is a key enabler of sixth-generation wireless communication networks. With this goal, a generic 3-dimensional (3D) channel model is analyzed for air-to-air (A2A) networks under antenna misalignment, radio frequency impairments, and polarization loss. The unique aspects of each aerial node are highlighted, and few-shot learning with a model-agnostic meta-learning (MAML) classifier is proposed for learning-to-learn (L2L) incident wave angle estimation by utilizing the received signal strength (RSS). Additionally, a more computationally efficient technique, first-order model-agnostic meta-learning (FOMAML), is implemented. It is observed that the proposed approach reaches up to 85% training accuracy and 75.4% evaluation accuracy with MAML. A convergence rate and accuracy trade-off is established for several cases of MAML and FOMAML. For different L2L models trained with limited data, heuristic accuracy performance is determined by an upper bound on the confidence probability.
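As a hedged sketch of the first-order variant, the toy loop below meta-trains a logistic-regression initialization over synthetic RSS-style tasks and adapts it with a single inner gradient step; the task generator, model, and hyperparameters are placeholders and do not reproduce the paper's A2A channel or network architecture.

```python
# Compact FOMAML sketch on synthetic few-shot tasks.
import numpy as np

rng = np.random.default_rng(0)
dim, inner_lr, meta_lr = 8, 0.1, 0.05
w_shared = rng.normal(size=dim)                  # structure shared across tasks

def sample_task(n_shots=10):
    """Toy few-shot task: binary angle-sector label from RSS-style features."""
    w_task = w_shared + 0.3 * rng.normal(size=dim)
    X = rng.normal(size=(2 * n_shots, dim))
    y = (X @ w_task > 0).astype(float)
    return (X[:n_shots], y[:n_shots]), (X[n_shots:], y[n_shots:])  # support, query

def loss_grad(w, X, y):
    """Gradient of the mean logistic loss."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / len(y)

# Meta-training: FOMAML uses the query-set gradient evaluated at the adapted
# weights as the meta-gradient (no second-order terms, hence "first-order").
w_meta = np.zeros(dim)
for _ in range(2000):
    (Xs, ys), (Xq, yq) = sample_task()
    w_adapted = w_meta - inner_lr * loss_grad(w_meta, Xs, ys)   # inner adaptation
    w_meta -= meta_lr * loss_grad(w_adapted, Xq, yq)            # outer (meta) update

# Few-shot evaluation on a fresh task: adapt with one gradient step, then test.
(Xs, ys), (Xq, yq) = sample_task()
w_new = w_meta - inner_lr * loss_grad(w_meta, Xs, ys)
acc = np.mean(((Xq @ w_new) > 0).astype(float) == yq)
print("query accuracy after one adaptation step:", round(float(acc), 3))
```

Full MAML would differentiate through the inner update as well, which is what drives the convergence-rate versus accuracy trade-off mentioned in the abstract.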