Abstract: Managing Type 1 Diabetes (T1D) demands constant vigilance as individuals strive to regulate their blood glucose levels to avert the dangers of dysglycemia (hyperglycemia or hypoglycemia). Despite the advent of sophisticated technologies such as automated insulin delivery (AID) systems, achieving optimal glycemic control remains a formidable task. AID systems integrate continuous subcutaneous insulin infusion (CSII) and continuous glucose monitor (CGM) data, offering promise in reducing variability and increasing glucose time-in-range. However, these systems often fail to prevent dysglycemia, partly because their prediction algorithms lack the precision needed to avert abnormal glucose events. This gap highlights the need for proactive behavioral adjustments. We address this need with GLIMMER (Glucose Level Indicator Model with Modified Error Rate), a machine learning approach for forecasting blood glucose levels. GLIMMER categorizes glucose values into normal and abnormal ranges and employs a novel custom loss function that prioritizes accuracy during dysglycemic events, where patient safety is most critical. To evaluate GLIMMER's potential for T1D management, we use both a publicly available dataset and newly collected data from 25 patients with T1D. In predicting next-hour glucose values, GLIMMER achieved a root mean square error (RMSE) of 23.97 (+/-3.77) mg/dL and a mean absolute error (MAE) of 15.83 (+/-2.09) mg/dL. These results represent a 23% improvement in RMSE and a 31% improvement in MAE over the best previously reported error rates.
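To make the idea of a range-aware loss concrete, the following is a minimal illustrative sketch of one way such a loss could be formulated: a weighted mean squared error that penalizes prediction errors more heavily when the true glucose reading falls in a dysglycemic range. The thresholds (70 and 180 mg/dL) and the weight value are assumptions chosen for illustration and are not GLIMMER's actual formulation.

```python
import numpy as np

# Assumed thresholds and weighting for illustration only; GLIMMER's actual
# custom loss may differ.
HYPO_THRESHOLD = 70     # mg/dL; readings below this are treated as hypoglycemic
HYPER_THRESHOLD = 180   # mg/dL; readings above this are treated as hyperglycemic
DYSGLYCEMIA_WEIGHT = 4.0  # hypothetical extra penalty applied in abnormal ranges


def range_weighted_mse(y_true, y_pred):
    """Mean squared error with larger weight on dysglycemic target values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    abnormal = (y_true < HYPO_THRESHOLD) | (y_true > HYPER_THRESHOLD)
    weights = np.where(abnormal, DYSGLYCEMIA_WEIGHT, 1.0)
    return float(np.mean(weights * (y_true - y_pred) ** 2))


# Example: the same absolute error contributes more to the loss when the
# true reading is hypo- or hyperglycemic than when it is in range.
truth = [65, 110, 250, 140]      # mg/dL
forecast = [80, 115, 220, 150]   # mg/dL
print(range_weighted_mse(truth, forecast))
```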