The stringent latency requirements of 6G orthogonal time-frequency space (OTFS) systems necessitate low-latency channel estimation designs. In this work, we propose a deep learning (DL)-based approach that integrates a state-of-the-art estimation algorithm with a time-frequency (TF) learning framework to minimize overall latency. The performance of the proposed approach is evaluated under challenging conditions: low delay and Doppler resolutions caused by limited time and frequency resources, and significant interpath interference (IPI) due to poor separability of the propagation paths in the delay-Doppler (DD) domain. Simulation results demonstrate that the proposed method achieves high estimation accuracy while reducing the latency of the maximization process by approximately 55\%. However, a performance trade-off is observed, with a loss of at most 3 dB at high pilot SNR values.
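For context, a minimal sketch of the sparse DD-domain channel representation commonly used in the OTFS literature, which underlies the separability and IPI discussion above; the symbols $P$, $h_p$, $\tau_p$, $\nu_p$, $M$, $N$, $\Delta f$, and $T$ follow standard OTFS notation and are assumptions not defined in the original abstract:
\[
h(\tau,\nu) = \sum_{p=1}^{P} h_p\, \delta(\tau - \tau_p)\, \delta(\nu - \nu_p),
\]
where the $P$ propagation paths have complex gains $h_p$, delays $\tau_p$, and Doppler shifts $\nu_p$. With $M$ subcarriers of spacing $\Delta f$ and $N$ symbols of duration $T$, the DD grid resolutions are $1/(M\Delta f)$ in delay and $1/(NT)$ in Doppler. Limited time and frequency resources therefore coarsen both resolutions, and paths whose $(\tau_p,\nu_p)$ pairs lie closer together than these resolutions, or off the grid, leak into neighboring bins, which is the source of the IPI considered in this work.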