Abstract:The immense volume of data generated by Earth observation (EO) satellites presents significant challenges in transmitting it to Earth over rate-limited satellite-to-ground communication links. This paper presents an efficient downlink framework for multi-spectral satellite images, leveraging adaptive transmission techniques based on pixel importance and link capacity. By integrating semantic communication principles, the framework prioritizes critical information, such as changed multi-spectral pixels, to optimize data transmission. The process involves preprocessing, assessing pixel importance to encode only significant changes, and dynamically adjusting transmissions to match channel conditions. Experimental results on a real dataset and a simulated link demonstrate that the proposed approach ensures high-quality data delivery while significantly reducing the amount of transmitted data, making it highly suitable for satellite-based EO applications.
Abstract:This work investigates the coexistence of sensing and communication functionalities in a base station (BS) serving a communication user in the uplink and simultaneously detecting a radar target with the same frequency resources. To address inter-functionality interference, we employ rate-splitting (RS) at the communication user and successive interference cancellation (SIC) at the joint radar-communication receiver at the BS. This approach is motivated by RS's proven effectiveness in mitigating inter-user interference among communication users. Building on the proposed system model based on RS, we derive inner bounds on performance in terms of ergodic data information rate for communication and ergodic radar estimation information rate for sensing. Additionally, we present a closed-form solution for the optimal power split in RS that maximizes the communication user's performance. The bounds achieved with RS are compared to conventional methods, including spectral isolation and full spectral sharing with SIC. We demonstrate that RS offers a superior performance trade-off between sensing and communication functionalities compared to traditional approaches. Pertinently, while the original concept of RS deals only with digital signals, this work puts forward RS as a general method for enabling non-orthogonal access for sensing signals. As a consequence, this paper provides a systematic and parametrized way to effectuate non-orthogonal sensing and communication waveforms.
Abstract:Achieving a flexible and efficient sharing of wireless resources among a wide range of novel applications and services is one of the major goals of the sixth generation of mobile systems (6G). Accordingly, this work investigates the performance of a real-time system that coexists with a broadband service in a frame-based wireless channel. Specifically, we consider real-time remote tracking of an information source, where a device monitors its evolution and sends updates to a base station (BS), which is responsible for real-time source reconstruction and, potentially, remote actuation. To achieve this, the BS employs a grant-free access mechanism to serve the monitoring device together with a broadband user, which share the available wireless resources through orthogonal or non-orthogonal multiple access schemes. We analyse the performance of the system using time-averaged reconstruction error, time-averaged cost of actuation error, and update-delivery cost as performance metrics. Furthermore, we analyse the performance of the broadband user in terms of throughput and energy efficiency. Our results show that an orthogonal resource sharing between the users is beneficial in most cases where the broadband user requires maximum throughput. However, sharing the resources in a non-orthogonal manner leads to far greater energy efficiency.
Abstract:Energy efficiency and information freshness are key requirements for sensor nodes serving Industrial Internet of Things (IIoT) applications, where a sink node collects informative and fresh data before a deadline, e.g., to control an external actuator. Content-based wake-up (CoWu) activates a subset of nodes that hold data relevant for the sink's goal, thereby offering an energy-efficient way to attain objectives related to information freshness. This paper focuses on a scenario where the sink collects fresh information on top-k values, defined as data from the nodes observing the k highest readings at the deadline. We introduce a new metric called top-k Query Age of Information (k-QAoI), which allows us to characterize the performance of CoWu by considering the characteristics of the physical process. Further, we show how to select the CoWu parameters, such as its timing and threshold, to attain both information freshness and energy efficiency. The numerical results reveal the effectiveness of the CoWu approach, which is able to collect top-k data with higher energy efficiency while reducing k-QAoI when compared to round-robin scheduling, especially when the number of nodes is large and k is small.
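The core CoWu mechanism described above can be sketched in a few lines: the sink broadcasts a threshold, and only nodes whose current reading meets it wake their main radio. This is an illustrative toy (the threshold here is chosen with oracle knowledge of the readings; the paper's contribution is precisely how to pick the threshold and timing without such knowledge, via the k-QAoI metric):

```python
import random

def cowu_round(readings, threshold):
    """Content-based wake-up: only nodes whose current reading meets
    the broadcast threshold activate their main radio and respond."""
    return [(i, v) for i, v in enumerate(readings) if v >= threshold]

def top_k(readings, k):
    """Ground truth: indices of the k highest readings at the deadline."""
    return sorted(range(len(readings)), key=lambda i: readings[i], reverse=True)[:k]

random.seed(0)
n, k = 50, 3
readings = [random.gauss(0.0, 1.0) for _ in range(n)]

# A threshold slightly below the k-th largest reading wakes a small
# superset of the top-k nodes: a few extra responses are traded for a
# high chance of capturing all top-k values before the deadline.
threshold = sorted(readings, reverse=True)[k - 1] - 0.1
woken = cowu_round(readings, threshold)
print(f"woken: {len(woken)} of {n} nodes")
```

With round-robin scheduling the sink would instead poll all n nodes in turn; the energy saving of CoWu grows with n when k is small, matching the abstract's conclusion.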
Abstract:The integration of Non-Terrestrial Networks (NTNs) with Low Earth Orbit (LEO) satellite constellations into 5G and Beyond is essential to achieve truly global connectivity. A distinctive characteristic of LEO mega-constellations is that they constitute a global infrastructure with predictable dynamics, which enables the pre-planned allocation of the radio resources. However, the different bands that can be used for ground-to-satellite communication are affected differently by atmospheric conditions such as precipitation, which introduces uncertainty in the attenuation of the communication links at high frequencies. Based on this, we present a compelling case for applying integrated sensing and communications (ISAC) in heterogeneous and multi-layer LEO satellite constellations over wide areas. Specifically, we present an ISAC framework and frame structure to accurately estimate the attenuation in the communication links due to precipitation, with the aim of finding the optimal serving satellites and resource allocation for downlink communication with users on the ground. The results show that, by dedicating an adequate amount of resources to sensing and solving the association and resource allocation problems jointly, it is feasible to increase the average throughput by 59% and the fairness by 600% when compared to solving these problems separately.
Abstract:This paper introduces a full solution for decentralized routing in Low Earth Orbit satellite constellations based on continual Deep Reinforcement Learning (DRL). This requires addressing multiple challenges, including the partial knowledge at the satellites and their continuous movement, and the time-varying sources of uncertainty in the system, such as traffic, communication links, or communication buffers. We follow a multi-agent approach, where each satellite acts as an independent decision-making agent, while acquiring a limited knowledge of the environment based on the feedback received from the nearby agents. The solution is divided into two phases. First, an offline learning phase relies on decentralized decisions and a global Deep Neural Network (DNN) trained with global experiences. Then, the online phase with local, on-board, and pre-trained DNNs requires continual learning to evolve with the environment, which can be done in two different ways: (1) Model anticipation, where the predictable conditions of the constellation are exploited by each satellite sharing its local model with the next satellite; and (2) Federated Learning (FL), where each agent's model is merged first at the cluster level and then aggregated in a global Parameter Server. The results show that, in the absence of high congestion, the proposed Multi-Agent DRL framework achieves the same E2E performance as a shortest-path solution, but the latter incurs intensive communication overhead for real-time network-wide knowledge of the system at a centralized node, whereas ours only requires limited feedback exchange among neighbouring satellites. Importantly, our solution adapts well to congestion conditions and exploits less loaded paths. Moreover, the divergence of models over time is easily tackled by the synergy between anticipation, applied for short-term alignment, and FL, utilized for long-term alignment.
Abstract:The traditional role of the network layer is the transfer of packet replicas from source to destination through intermediate network nodes. We present a generative network layer that uses Generative AI (GenAI) at intermediate or edge network nodes and analyze its impact on the required data rates in the network. We conduct a case study where the GenAI-aided nodes generate images from prompts that consist of substantially compressed latent representations. The results from network flow analyses under image quality constraints show that the generative network layer can achieve an improvement of more than 100% in terms of the required data rate.
Abstract:The amount of data generated by Earth observation satellites can be enormous, which poses a great challenge to the satellite-to-ground connections with limited rate. This paper considers the problem of efficient downlink communication of multi-spectral satellite images for Earth observation using change detection. The proposed method for image processing consists of the joint design of cloud removal and change encoding, which can be seen as an instance of semantic communication, as it encodes important information, such as changed multi-spectral pixels (MPs), while aiming to minimize energy consumption. It comprises a three-stage end-to-end scoring mechanism that determines the importance of each MP before deciding its transmission. Specifically, the sensing image is (1) standardized, (2) passed through high-performance cloud filtering via the Cloud-Net model, and (3) passed to the proposed scoring algorithm, which uses Change-Net to identify MPs that have a high likelihood of being changed, compresses them, and forwards the result to the ground station. The experimental results indicate that the proposed framework is effective in optimizing energy usage while preserving high-quality data transmission in satellite-based Earth observation applications.
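The three-stage pipeline above (standardize, filter clouds, score changes, transmit only high-scoring MPs) can be sketched as follows. This is an illustrative approximation only: the trained Cloud-Net and Change-Net models are replaced by simple stand-in heuristics (a brightness threshold and a spectral-distance score), and all threshold values are invented for the toy example:

```python
import numpy as np

def standardize(img, mean, std):
    """Stage 1: normalize all bands with shared reference statistics."""
    return (img - mean) / std

def cloud_mask(img_std, brightness_threshold=6.0):
    """Stage 2 stand-in for the Cloud-Net model: a crude heuristic
    flagging very bright pixels as cloud (threshold is illustrative)."""
    return img_std.mean(axis=2) > brightness_threshold

def change_scores(img_std, ref_std):
    """Stage 3 stand-in for Change-Net: per-pixel spectral distance to
    a reference image, used as a change-likelihood score."""
    return np.linalg.norm(img_std - ref_std, axis=2)

def select_mps(img, ref, score_threshold=3.0):
    """Return a boolean mask of multi-spectral pixels (MPs) worth
    transmitting: cloud-free pixels with a high change score."""
    mean = ref.mean(axis=(0, 1), keepdims=True)
    std = ref.std(axis=(0, 1), keepdims=True) + 1e-8
    img_s = standardize(img, mean, std)
    ref_s = standardize(ref, mean, std)
    return ~cloud_mask(img_s) & (change_scores(img_s, ref_s) > score_threshold)

rng = np.random.default_rng(0)
ref = rng.normal(size=(8, 8, 4))   # reference scene, 4 spectral bands
img = ref.copy()
img[2:4, 2:4, :] += 3.0            # genuinely changed 2x2 region
img[6:8, 6:8, :] += 10.0           # bright "cloud" patch to be filtered out
mask = select_mps(img, ref)
print("MPs selected for transmission:", int(mask.sum()), "of", mask.size)
```

Only the changed, cloud-free region survives the mask; everything else is withheld from the downlink, which is the source of the energy savings the abstract reports.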
Abstract:The integration of Low Earth Orbit (LEO) satellite constellations into 5G and Beyond is essential to achieve efficient global connectivity. As LEO satellites are a global infrastructure with predictable dynamics, a pre-planned fair and load-balanced allocation of the radio resources to provide efficient downlink connectivity over large areas is an achievable goal. In this paper, we propose a distributed algorithm and a globally optimal algorithm for satellite-to-cell resource allocation with multiple beams. These algorithms aim to achieve a fair allocation of time-frequency resources and beams to the cells based on the number of users in connected mode (i.e., registered). Our analyses focus on evaluating the trade-offs between average per-user throughput, fairness, number of cell handovers, and computational complexity in a downlink scenario with fixed cells, where the number of users is extracted from a population map. Our results show that both algorithms achieve a similar average per-user throughput. However, the globally optimal algorithm achieves a fairness index over 0.9 in all cases, which is more than twice that of the distributed algorithm. Furthermore, by correctly setting the handover cost parameter, the number of handovers can be effectively reduced by more than 70% with respect to the case where the handover cost is not considered.
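The fairness index reported above is, by convention in resource-allocation work, Jain's fairness index (an assumption here, as the abstract does not name it): it equals 1 when all users receive identical throughput and 1/n when one user takes everything.

```python
def jain_fairness(throughputs):
    """Jain's fairness index: (sum x)^2 / (n * sum x^2).
    Ranges from 1/n (maximally unfair) to 1 (perfectly fair)."""
    n = len(throughputs)
    s = sum(throughputs)
    return s * s / (n * sum(x * x for x in throughputs))

print(jain_fairness([1.0, 1.0, 1.0, 1.0]))   # perfectly fair allocation
print(jain_fairness([4.0, 0.0, 0.0, 0.0]))   # one user takes everything
```

An index above 0.9 for the globally optimal algorithm thus means per-user throughputs are nearly uniform across cells.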
Abstract:The widespread adoption of Reconfigurable Intelligent Surfaces (RISs) in future practical wireless systems is critically dependent on the design and implementation of efficient access protocols, an issue that has received little attention in the research literature. In this paper, we propose a grant-free random access (RA) protocol for a RIS-assisted wireless communication setting, where a massive number of user equipments (UEs) attempt to access an access point (AP). The proposed protocol relies on a channel oracle, which enables the UEs to infer the best RIS configurations, providing opportunistic access. The inference is based on a model created during a training phase with a greatly reduced set of RIS configurations. Specifically, we consider a system whose operation is divided into three blocks: i) a downlink training block, which trains the model used by the oracle, ii) an uplink access block, where the oracle infers the best access slots, and iii) a downlink acknowledgment block, which provides feedback to the UEs that were successfully decoded by the AP during access. Numerical results show that the proper integration of the RIS into the protocol design is able to increase the expected end-to-end throughput by approximately 40% compared to the regular repetition slotted ALOHA protocol.
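The repetition slotted ALOHA baseline mentioned above can be sketched with a short Monte-Carlo simulation. This is a minimal model under simplifying assumptions (collision channel, no capture, no SIC, a user is decoded if at least one of its replicas lands alone in a slot); all parameter values are illustrative, not taken from the paper:

```python
import random

def repetition_slotted_aloha(n_users, n_slots, replicas=2, activity=0.5,
                             rounds=2000, seed=1):
    """Monte-Carlo throughput (decoded packets per slot) of regular
    repetition slotted ALOHA: each active user transmits `replicas`
    copies in randomly chosen slots of the frame, and is decoded if at
    least one copy occupies a singleton (collision-free) slot."""
    rng = random.Random(seed)
    decoded_total = 0
    for _ in range(rounds):
        occupancy = [[] for _ in range(n_slots)]
        for u in range(n_users):
            if rng.random() < activity:          # user has a packet this frame
                for s in rng.sample(range(n_slots), replicas):
                    occupancy[s].append(u)
        # distinct users that own at least one singleton slot
        decoded = {slot[0] for slot in occupancy if len(slot) == 1}
        decoded_total += len(decoded)
    return decoded_total / (rounds * n_slots)

print("baseline throughput:", round(repetition_slotted_aloha(20, 30), 3))
```

The RIS-assisted protocol in the paper improves on this baseline by steering the channel so that the oracle can direct UEs toward access slots likely to be decodable, rather than choosing slots uniformly at random.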