Abstract: In recent years, the evolution of telecom toward intelligent, autonomous, and open networks has led to increasingly complex telecom software systems that must support heterogeneous deployment scenarios with multi-standard and multi-vendor requirements. As a result, developing and testing software for all deployment scenarios has become a challenge for large-scale telecom software companies. To address these challenges, we propose a framework for automated test generation for large-scale telecom software systems. We first generate test case input data for test scenarios observed during field trials using a time-series generative model trained on historical telecom network data; the generative model also helps preserve the privacy of telecom data. The generated time-series software performance data are then combined with test descriptions written in natural language to generate test scripts using a generative large language model. Our comprehensive experiments on public datasets and on telecom datasets obtained from operational telecom networks demonstrate that the framework can effectively generate comprehensive test case input data and useful test code.
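The abstract describes a two-stage pipeline (synthetic time-series test input generation, then LLM-based test script generation) without implementation details. The sketch below is only an illustration of that flow under stated assumptions: `KPISeriesGenerator` is a hypothetical stand-in for the trained time-series generative model, `call_llm` is a placeholder for whichever LLM SDK is used, and all KPI names and values are invented for the example.

```python
import json
import numpy as np


class KPISeriesGenerator:
    """Hypothetical stand-in for the trained time-series generative model.

    Here it simply draws Gaussian-perturbed sequences around per-KPI
    statistics; the paper's model is trained on historical telecom data.
    """

    def __init__(self, kpi_means, kpi_stds, seq_len=96):
        self.kpi_means = kpi_means  # dict: KPI name -> mean
        self.kpi_stds = kpi_stds    # dict: KPI name -> standard deviation
        self.seq_len = seq_len      # samples per generated series

    def sample(self, rng=None):
        rng = rng or np.random.default_rng()
        return {
            kpi: (self.kpi_means[kpi]
                  + self.kpi_stds[kpi] * rng.standard_normal(self.seq_len)
                  ).round(3).tolist()
            for kpi in self.kpi_means
        }


def build_test_script_prompt(test_description: str, kpi_series: dict) -> str:
    """Combine the natural-language test description with generated KPI data."""
    return (
        "You are a telecom test automation assistant.\n"
        f"Test description: {test_description}\n"
        f"Synthetic KPI input data (JSON): {json.dumps(kpi_series)}\n"
        "Generate a pytest test script that replays this data against the "
        "system under test and asserts the expected behaviour."
    )


def call_llm(prompt: str) -> str:
    """Placeholder for the generative LLM call; wire in a provider SDK here."""
    raise NotImplementedError("Replace with an actual LLM API call.")


if __name__ == "__main__":
    generator = KPISeriesGenerator(
        kpi_means={"dl_throughput_mbps": 120.0, "rrc_setup_success_rate": 0.99},
        kpi_stds={"dl_throughput_mbps": 15.0, "rrc_setup_success_rate": 0.005},
    )
    prompt = build_test_script_prompt(
        "Verify handover succeeds under high downlink load.",
        generator.sample(),
    )
    print(prompt)  # this prompt would be passed to call_llm(...) to obtain the test script
```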
Abstract: In today's hyper-connected world, ensuring the reliability of telecom networks is increasingly crucial. Telecom networks encompass numerous underlying and intertwined software and hardware components, each providing different functionalities. To ensure the stability of telecom networks, telecom software and hardware vendors have developed several methods to detect aberrant behavior in telecom networks and enable instant feedback and alerts. These approaches, although powerful, struggle to generalize due to the unsteady nature of software-intensive embedded systems and the complexity and diversity of multi-standard mobile networks. In this paper, we present a system to detect anomalies in telecom networks using a generative AI model. We evaluate several strategies for training diffusion models for anomaly detection on multivariate time-series data. The contributions of this paper are threefold: (i) a framework for utilizing diffusion models for time-series anomaly detection in telecom networks, (ii) a diffusion model architecture that outperforms other state-of-the-art techniques, and (iii) experiments on a real-world dataset demonstrating that our model provides explainable results, exposing some of its limitations and suggesting future research avenues to further enhance its capabilities.
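The abstract does not specify the diffusion architecture, so the following is a minimal sketch of the general idea rather than the paper's method: a DDPM-style denoiser trained on fixed-length multivariate KPI windows, with the anomaly score taken as the noise-prediction error (normal windows denoise well, anomalous ones do not). The window size, noise schedule, and network are illustrative assumptions.

```python
import torch
import torch.nn as nn

T = 100                                   # diffusion steps (assumed)
betas = torch.linspace(1e-4, 0.02, T)     # linear noise schedule (assumed)
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)


class Denoiser(nn.Module):
    """Small MLP denoiser over flattened multivariate windows."""

    def __init__(self, window: int, n_kpis: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window * n_kpis + 1, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, window * n_kpis),
        )

    def forward(self, x_noisy, t):
        # Condition on the normalized timestep by concatenating it as a feature.
        t_feat = (t.float() / T).unsqueeze(-1)
        flat = torch.cat([x_noisy.flatten(1), t_feat], dim=-1)
        return self.net(flat).view_as(x_noisy)


def add_noise(x0, t):
    """Forward diffusion q(x_t | x_0) for a batch of windows."""
    noise = torch.randn_like(x0)
    a = alphas_cumprod[t].view(-1, 1, 1)
    return a.sqrt() * x0 + (1 - a).sqrt() * noise, noise


def train_step(model, optimizer, x0):
    """Standard denoising objective: predict the injected noise."""
    t = torch.randint(0, T, (x0.size(0),))
    x_noisy, noise = add_noise(x0, t)
    loss = nn.functional.mse_loss(model(x_noisy, t), noise)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


@torch.no_grad()
def anomaly_score(model, x0, t_eval: int = 50):
    """Higher noise-prediction error => more anomalous window."""
    t = torch.full((x0.size(0),), t_eval, dtype=torch.long)
    x_noisy, noise = add_noise(x0, t)
    err = (model(x_noisy, t) - noise).pow(2)
    return err.mean(dim=(1, 2))            # one score per window


if __name__ == "__main__":
    window, n_kpis = 32, 8
    model = Denoiser(window, n_kpis)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    normal_batch = torch.randn(64, window, n_kpis)   # stand-in for real KPI windows
    for _ in range(5):
        train_step(model, opt, normal_batch)
    print(anomaly_score(model, normal_batch[:4]))
```

Scoring by per-feature error (before averaging over KPIs) is one simple route to the explainability mentioned in the abstract, since it indicates which KPIs contribute most to an anomalous window.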