Abstract: In this paper, optimal linear precoding for the multibeam geostationary Earth orbit (GEO) satellite multi-user (MU) multiple-input multiple-output (MIMO) downlink scenario is addressed. Multi-user interference is one of the major issues faced by satellites serving multiple users that operate on a common time-frequency resource block in the downlink channel. To mitigate this issue, optimal linear precoders are implemented at the gateways (GWs). The precoding computation utilizes the channel state information obtained at the user terminals (UTs). The optimal linear precoders are derived through beamformer updates and power control using an iterative per-antenna power optimization algorithm that requires only a limited number of iterations. The efficacy of the proposed algorithm is validated in an in-lab experiment on 16×16 precoding over a multibeam satellite, transmitting and receiving the precoded data with the digital video broadcasting satellite second generation extension (DVB-S2X) standard at the GW and the UTs. Software-defined radio platforms are employed to emulate the GWs, the UTs, and the satellite links. The validation is supported by comparing the proposed optimal linear precoder with full frequency reuse (FFR) and minimum mean square error (MMSE) schemes. The experimental results demonstrate that the optimal linear precoders can successfully cancel the inter-user interference in the emulated satellite FFR link. Thus, optimal linear precoding brings gains in terms of an enhanced signal-to-interference-plus-noise ratio, increased system throughput, and improved spectral efficiency.
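As a point of reference for the MMSE baseline mentioned in the abstract, the following is a minimal NumPy sketch of a regularized zero-forcing (MMSE) precoder with a simple per-antenna power normalization. The matrix dimensions, noise variance, regularization choice, and uniform scaling strategy are illustrative assumptions; the sketch does not reproduce the paper's iterative per-antenna power optimization algorithm.

```python
import numpy as np

def mmse_precoder(H, noise_var, p_ant):
    """Sketch of a regularized zero-forcing (MMSE) precoder.

    H         : (K, N) complex channel matrix (K users, N feed/antenna signals)
    noise_var : receiver noise variance used as regularization
    p_ant     : per-antenna power limit (illustrative scalar)
    """
    K, _ = H.shape
    # Regularized channel inverse: W = H^H (H H^H + sigma^2 K I)^{-1}
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T + noise_var * K * np.eye(K))
    # Uniform scaling so that no antenna (row of W) exceeds p_ant for unit-power symbols.
    # The paper's iterative per-antenna optimization is more refined; this is only a stand-in.
    row_power = np.sum(np.abs(W) ** 2, axis=1)
    W *= np.sqrt(p_ant / row_power.max())
    return W

# Illustrative 16x16 example matching the experiment's dimensions.
rng = np.random.default_rng(0)
H = (rng.standard_normal((16, 16)) + 1j * rng.standard_normal((16, 16))) / np.sqrt(2)
W = mmse_precoder(H, noise_var=0.1, p_ant=1.0)
s = rng.standard_normal(16) + 1j * rng.standard_normal(16)   # user symbols
x = W @ s                                                     # precoded feed signals
```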
Abstract: This paper delves into the application of Machine Learning (ML) techniques in the realm of 5G Non-Terrestrial Networks (5G-NTN), particularly focusing on symbol detection and equalization for the Physical Broadcast Channel (PBCH). As 5G-NTN gains prominence within the 3GPP ecosystem, ML offers significant potential to enhance wireless communication performance. To investigate these possibilities, we present ML-based models trained with both synthetic data and real data captured from a 5G over-the-satellite testbed. Our analysis examines the performance of these models under various Signal-to-Noise Ratio (SNR) scenarios and evaluates their effectiveness in symbol enhancement and channel equalization tasks. The results highlight the performance of the ML models in controlled settings and their adaptability to real-world challenges, shedding light on the potential benefits of applying ML in 5G-NTN.
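To make the symbol-equalization task concrete, below is a minimal PyTorch sketch of a small neural network that maps received, channel-distorted QPSK symbols (together with a channel estimate) back to the transmitted symbols, trained with an MSE loss. The network size, single-tap channel model, SNR, and training setup are illustrative assumptions only and are not the models or testbed data used in the paper.

```python
import math
import torch
import torch.nn as nn

# Toy data: QPSK symbols through a random single-tap fading channel with AWGN.
# Real and imaginary parts are carried as two features per symbol.
torch.manual_seed(0)
n = 50_000
bits = torch.randint(0, 2, (n, 2)).float()
tx = (2 * bits - 1) / math.sqrt(2.0)                 # transmitted QPSK symbols (I, Q)
h = torch.randn(n, 2) / math.sqrt(2.0)               # per-symbol channel coefficient (I, Q)
rx_i = h[:, 0] * tx[:, 0] - h[:, 1] * tx[:, 1]       # complex multiplication h * tx
rx_q = h[:, 0] * tx[:, 1] + h[:, 1] * tx[:, 0]
snr_db = 10.0                                        # illustrative SNR operating point
noise_std = 10 ** (-snr_db / 20) / math.sqrt(2.0)
rx = torch.stack([rx_i, rx_q], dim=1) + noise_std * torch.randn(n, 2)

# The model sees the received symbol and the channel estimate, and predicts the tx symbol.
model = nn.Sequential(
    nn.Linear(4, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 2),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

features = torch.cat([rx, h], dim=1)
for epoch in range(20):                              # short full-batch training loop
    opt.zero_grad()
    loss = loss_fn(model(features), tx)
    loss.backward()
    opt.step()

print(f"final training MSE: {loss.item():.4f}")
```

In practice the same model could be retrained or fine-tuned on symbols captured from an over-the-satellite testbed, which is the kind of synthetic-to-real comparison the abstract describes.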