Abstract:Efficient Random Access (RA) is critical for enabling reliable communication in Industrial Internet of Things (IIoT) networks. Herein, we propose a deep-reinforcement-learning-based distributed RA scheme, termed Neural Network-Based Bandit (NNBB), for the IIoT alarm scenario. In this scenario, several devices may detect a common critical event, and the goal is to ensure that the alarm information is successfully delivered by at least one of them. The proposed NNBB scheme runs at each device, trains itself online, and establishes implicit inter-device coordination toward the common goal. Devices can transmit simultaneously on multiple orthogonal channels, and each transmission pattern constitutes an action for the NNBB, which uses a deep neural network to select it. Our simulation results show that as the number of devices in the network increases, so does the performance gain of NNBB over the Multi-Armed Bandit (MAB) RA benchmark. For instance, with four channels, the success rate of NNBB drops by only 7% when the number of devices increases from 10 to 60, whereas MAB suffers a 25% drop.
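The abstract does not detail the NNBB state design, network architecture, or training procedure; the sketch below is only a minimal illustration in that spirit, assuming a small two-layer network per device, epsilon-greedy exploration over the 2^K channel-activation patterns, and a binary success reward, all of which are illustrative assumptions rather than the authors' exact method.

# Minimal sketch of a neural-bandit device agent (illustrative assumptions only).
import numpy as np
from itertools import product

K = 4                                        # number of orthogonal channels
ACTIONS = list(product([0, 1], repeat=K))    # every channel-activation pattern

class NeuralBanditDevice:
    def __init__(self, hidden=32, lr=0.01, eps=0.1, rng=None):
        self.rng = rng or np.random.default_rng()
        self.eps, self.lr = eps, lr
        n = len(ACTIONS)                     # one-hot of previous action as input
        self.W1 = self.rng.normal(0.0, 0.1, (n, hidden))
        self.W2 = self.rng.normal(0.0, 0.1, (hidden, n))
        self.last = 0                        # index of the previous action

    def _forward(self, x):
        h = np.tanh(x @ self.W1)
        return h, h @ self.W2                # predicted reward per action

    def act(self):
        x = np.eye(len(ACTIONS))[self.last]
        if self.rng.random() < self.eps:     # explore
            a = int(self.rng.integers(len(ACTIONS)))
        else:                                # exploit
            _, q = self._forward(x)
            a = int(np.argmax(q))
        return a, ACTIONS[a]

    def update(self, action, reward):
        # One SGD step on the squared error of the chosen action's prediction.
        x = np.eye(len(ACTIONS))[self.last]
        h, q = self._forward(x)
        grad_q = np.zeros_like(q)
        grad_q[action] = q[action] - reward
        dh = (grad_q @ self.W2.T) * (1.0 - h ** 2)
        self.W2 -= self.lr * np.outer(h, grad_q)
        self.W1 -= self.lr * np.outer(x, dh)
        self.last = action

# Per slot: a, pattern = device.act(); transmit on the channels with pattern[k] == 1,
# then call device.update(a, reward) with reward = 1 if the alarm got through.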
Abstract:Low-cost, resource-constrained, maintenance-free, and energy-harvesting (EH) Internet of Things (IoT) devices, referred to as zero-energy devices (ZEDs), are rapidly attracting attention from industry and academia due to their myriad applications. To date, such devices remain largely unsupported by modern IoT connectivity solutions due to their intrinsic fabrication, hardware, deployment, and operation limitations, and there is still little clarity on their key technical enablers and prospects. Herein, we address this by discussing the main characteristics and enabling technologies of ZEDs within the next generation of mobile networks, specifically focusing on unconventional EH sources, multi-source EH, power management, energy storage solutions, manufacturing materials and practices, backscattering, and low-complexity receivers. Moreover, we highlight the need for lightweight and energy-aware computing, communication, and scheduling protocols, and discuss potential approaches related to TinyML, duty cycling, and infrastructure enablers such as radio frequency wireless power transfer and wake-up protocols. Challenging aspects and open research directions are identified and discussed in each case. Finally, we showcase an experimental ZED proof-of-concept based on ambient cellular backscattering.
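As a rough illustration of the duty-cycling argument for ZEDs, the short sketch below checks what active-time fraction a given harvested power budget can sustain on average; all power figures are illustrative assumptions, not measurements or values from the paper.

# Minimal sketch of a duty-cycling energy budget check for an EH device
# (illustrative power figures only).
def sustainable_duty_cycle(p_harvest_uw, p_active_uw, p_sleep_uw):
    """Largest active-time fraction the harvested power can sustain on average."""
    if p_harvest_uw <= p_sleep_uw:
        return 0.0                       # cannot even cover sleep consumption
    d = (p_harvest_uw - p_sleep_uw) / (p_active_uw - p_sleep_uw)
    return min(d, 1.0)

# Example: 80 uW harvested, 10 mW while active (RX/TX), 2 uW in sleep.
print(f"{sustainable_duty_cycle(80, 10_000, 2):.4%}")   # ~0.78% active time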
Abstract:Spatially correlated device activation is a typical feature of the Internet of Things (IoT). This motivates the development of channel scheduling (CS) methods that efficiently mitigate device collisions in such scenarios, which constitutes the scope of this work. Specifically, we present a quadratic program (QP) formulation of the CS problem that accounts for the joint activation probabilities among devices. This formulation allows the devices to select their transmit channels stochastically, thus leading to a soft-clustering approach. We prove that the optimal QP solution is only attained when the problem is recast as a hard-clustering one, yielding a pure integer QP, which we then transform into a pure integer linear program (PILP). We leverage the branch-and-cut (B&C) algorithm to solve the PILP optimally. Since B&C is computationally costly, we also consider low-complexity sub-optimal clustering methods to tackle the clustering problem in CS. Our findings demonstrate that the CS strategy obtained with B&C significantly outperforms those derived from the sub-optimal clustering methods, even as device correlation increases.
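The abstract does not spell out the exact QP objective or the QP-to-PILP transformation; the sketch below assumes the goal is to minimize the expected number of pairwise collisions weighted by joint activation probabilities p[i][j], uses binary assignment variables x[i][k] with the standard product linearization y[i][j][k] >= x[i][k] + x[j][k] - 1, and solves the resulting PILP with CBC (a branch-and-cut solver) through PuLP. It is a hedged illustration, not the paper's formulation.

# Minimal sketch of hard-clustering channel assignment as a pure integer linear
# program solved with CBC via PuLP (objective, data, and linearization are assumptions).
import random
import pulp

N, K = 6, 2                                   # devices, channels
random.seed(1)
p = [[0.0] * N for _ in range(N)]             # joint activation probabilities
for i in range(N):
    for j in range(i + 1, N):
        p[i][j] = p[j][i] = random.uniform(0.0, 0.2)

prob = pulp.LpProblem("channel_scheduling", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (range(N), range(K)), cat="Binary")
y = pulp.LpVariable.dicts("y", (range(N), range(N), range(K)), lowBound=0)

# Expected pairwise collisions: p[i][j] is counted when i and j share a channel.
prob += pulp.lpSum(p[i][j] * y[i][j][k]
                   for i in range(N) for j in range(i + 1, N) for k in range(K))

for i in range(N):                            # each device is assigned one channel
    prob += pulp.lpSum(x[i][k] for k in range(K)) == 1

for i in range(N):                            # y >= x_i + x_j - 1 linearizes x_i * x_j
    for j in range(i + 1, N):
        for k in range(K):
            prob += y[i][j][k] >= x[i][k] + x[j][k] - 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))      # CBC performs branch-and-cut
assignment = [max(range(K), key=lambda k: x[i][k].value()) for i in range(N)]
print("channel per device:", assignment)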
Abstract:Technology solutions must effectively balance economic growth, social equity, and environmental integrity to achieve a sustainable society. Notably, although the Internet of Things (IoT) paradigm is a key sustainability enabler, critical issues such as increasing maintenance operations, energy consumption, and the manufacturing and disposal of IoT devices have long-term negative economic, societal, and environmental impacts and must be addressed efficiently. This calls for self-sustainable IoT ecosystems that require minimal external resources and intervention, effectively utilize renewable energy sources, and recycle materials whenever possible, thus encompassing energy sustainability. In this work, we focus on energy-sustainable IoT during the operation phase, although our discussions sometimes extend to other sustainability aspects and IoT lifecycle phases. Specifically, we provide a fresh look at energy-sustainable IoT and identify energy provision, energy transfer, and energy efficiency as the three main energy-related processes whose harmonious coexistence pushes toward realizing self-sustainable IoT systems. Their main related technologies, recent advances, challenges, and research directions are also discussed. Moreover, we overview relevant performance metrics for assessing the energy-sustainability potential of a given technique, technology, device, or network, and list some target values for the next generation of wireless systems. Overall, this paper offers insights that are valuable for advancing sustainability goals for present and future generations.
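As an illustration of one widely used energy-related performance metric, the short sketch below computes the energy efficiency of a transmission in delivered bits per joule; the numbers are illustrative assumptions, not the target values discussed in the paper.

# Minimal sketch of a common energy-efficiency metric (delivered bits per joule);
# the figures are illustrative assumptions only.
def energy_efficiency_bits_per_joule(payload_bits, airtime_s, power_w):
    return payload_bits / (power_w * airtime_s)

# Example: 1 kB payload sent in 50 ms at 100 mW total consumption.
print(f"{energy_efficiency_bits_per_joule(8_000, 0.05, 0.1):,.0f} bit/J")  # 1,600,000 bit/J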
Abstract:In recent years, Artificial Intelligence (AI) and Machine Learning (ML) have gained significant interest from both industry and academia. Notably, conventional ML techniques require enormous amounts of power to reach the desired accuracy, which has limited their use mainly to high-capability devices such as network nodes. However, with advances in technologies such as the Internet of Things (IoT) and edge computing, it is desirable to incorporate ML into resource-constrained embedded devices for distributed and ubiquitous intelligence. This has motivated the emergence of the TinyML paradigm, an embedded ML approach that enables ML applications on cheap, resource- and power-constrained devices. However, the transition toward practical TinyML deployments raises challenges, such as optimizing limited processing capacity, improving reliability, and maintaining learning models' accuracy, that require timely solutions. In this article, the various avenues available for TinyML implementation are reviewed. First, a background of TinyML is provided, followed by detailed discussions of the tools supporting it. Then, state-of-the-art applications of TinyML using advanced technologies are detailed. Lastly, research challenges and future directions are identified.
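As one concrete example of the tool support surveyed in such articles, the sketch below uses the TensorFlow Lite converter to produce a post-training-quantized model small enough for a microcontroller-class target. It assumes an already trained Keras model `model` and calibration data `x_train`, both hypothetical here; deployment specifics (e.g., the microcontroller runtime) are tool-dependent and omitted.

# Minimal sketch of preparing a model for a TinyML target with TensorFlow Lite;
# `model` and `x_train` are assumed to exist and are not defined in this snippet.
import numpy as np
import tensorflow as tf

def representative_data():
    # A few calibration samples for post-training quantization.
    for sample in x_train[:100]:
        yield [np.expand_dims(sample, 0).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]        # enable quantization
converter.representative_dataset = representative_data      # calibrate value ranges
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:                        # flash-able flatbuffer
    f.write(tflite_model)
print(f"model size: {len(tflite_model) / 1024:.1f} KiB")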