Abstract: 5G cellular networks are designed to support a new range of applications not supported by previous standards. Among these, ultra-reliable low-latency communication (URLLC) applications are arguably the most challenging. URLLC service requires the user equipment (UE) to be able to transmit its data under strict latency constraints with high reliability. To address these requirements, new technologies, such as mini-slots, semi-persistent scheduling, and grant-free access, were introduced in the 5G standards. In this work, we formulate a spatiotemporal mathematical model to evaluate the user-plane latency and reliability performance of millimetre wave (mmWave) massive multiple-input multiple-output (MIMO) URLLC with reactive and K-repetition hybrid automatic repeat request (HARQ) protocols. We derive closed-form approximate expressions for the latent access failure probability and validate them using numerical simulations. The results show that, under certain conditions, mmWave massive MIMO can reduce the failure probability by a factor of 32. Moreover, we identify that, beyond a certain number of antennas, there is no significant improvement in reliability. Finally, we conclude that mmWave massive MIMO alone is not enough to provide the performance guarantees required by the most stringent URLLC applications.
Abstract: With the continuous growth of machine-type devices (MTDs), it is expected that massive machine-type communication (mMTC) will be the dominant form of traffic in future wireless networks. Applications based on this technology have fundamentally different traffic characteristics from human-to-human (H2H) communication, which involves a relatively small number of devices transmitting large packets consistently. Conversely, in mMTC applications, a very large number of MTDs transmit small packets sporadically. Therefore, conventional grant-based access schemes commonly adopted for H2H service are not suitable for mMTC, as they incur a large overhead associated with the channel request procedure. We propose three grant-free distributed optimization architectures that are able to significantly reduce the average power consumption of the network. The problem of physical layer (PHY) and medium access control (MAC) optimization in grant-free random access transmission is modeled as a partially observable stochastic game (POSG) aimed at minimizing the average transmit power under a per-device delay constraint. The results show that the proposed architectures achieve significantly lower average latency than a baseline, while consuming less power. Moreover, the proposed architectures are more robust than the baseline, as they exhibit less variance in performance across different system realizations.