Abstract: This study explores the use of historical data from Global Navigation Satellite System (GNSS) scintillation monitoring receivers to predict the severity of amplitude scintillation, a phenomenon in which electron density irregularities in the ionosphere cause fluctuations in GNSS signal power. These fluctuations can be quantified using the S4 index, but real-time S4 measurements are not always available. The research focuses on developing a machine learning (ML) model that forecasts the intensity of amplitude scintillation, categorizing it into low, medium, or high severity based on temporal and spatial factors. Of six ML models evaluated, XGBoost proved the most effective, achieving 77% prediction accuracy when trained on a balanced dataset. This work underscores the effectiveness of machine learning in improving the reliability and performance of GNSS signals and navigation systems by accurately predicting amplitude scintillation severity.