This study proposes two novel learning rate schedulers: the Hyperbolic Learning Rate Scheduler (HyperbolicLR) and the Exponential Hyperbolic Learning Rate Scheduler (ExpHyperbolicLR). These schedulers address the inconsistent learning curves often observed with conventional schedulers when the number of epochs is changed. By leveraging the asymptotic behavior of hyperbolic curves, the proposed schedulers maintain more consistent learning curves across varying epoch settings. HyperbolicLR applies this property directly in the epoch-learning rate space, while ExpHyperbolicLR applies it in the exponential space of epochs and learning rates. To evaluate these schedulers, we first found the optimal hyperparameters for each scheduler on a small number of epochs, fixed these values, and compared performance as the number of epochs increased. Our experimental results on various deep learning tasks and architectures demonstrate that both HyperbolicLR and ExpHyperbolicLR maintain more consistent performance improvements than conventional schedulers as the number of epochs increases. These findings suggest that our hyperbolic-based learning rate schedulers offer a more robust and efficient approach to training deep neural networks, especially when computational resources or time constraints preclude extensive hyperparameter searches.
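To make the idea of a hyperbolic, asymptote-following schedule concrete, the sketch below wires an illustrative hyperbolic-shaped decay into PyTorch via `LambdaLR`. The specific parameterization (the floor ratio `eta_min_ratio` and the epoch budget `num_epochs`) is an assumption chosen for demonstration, not the exact formulation proposed in this work.

```python
import torch
from torch.optim.lr_scheduler import LambdaLR

def hyperbolic_factor(epoch, num_epochs, eta_min_ratio=0.1):
    """Illustrative hyperbolic-shaped decay factor (assumed form, not the paper's definition).

    Follows one branch of a rectangular hyperbola: starts at 1.0 and bends
    toward the asymptotic floor eta_min_ratio as training progresses.
    """
    t = min(epoch / num_epochs, 1.0)          # fraction of training completed, clamped to [0, 1]
    return eta_min_ratio + (1.0 - eta_min_ratio) * (1.0 - t) / (1.0 + t)

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
num_epochs = 50
scheduler = LambdaLR(optimizer, lr_lambda=lambda e: hyperbolic_factor(e, num_epochs))

for epoch in range(num_epochs):
    # ... one epoch of training would run here (forward, loss, backward) ...
    optimizer.step()      # placeholder step; a real loop backpropagates a loss first
    scheduler.step()
    if epoch % 10 == 0:
        print(epoch, scheduler.get_last_lr())
```

The multiplicative factor `(1 - t) / (1 + t)` traces a hyperbola in the epoch-learning rate plane, so the curve approaches its floor smoothly rather than collapsing to it; this mirrors, in spirit, the asymptotic behavior the proposed schedulers rely on.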