This paper tackles the problem of passive gaze estimation using both event and frame data. Because physiological structures differ inherently across subjects, it is intractable to estimate gaze accurately from a single given state alone. We therefore reformulate gaze estimation as the quantification of state transitions from the current state to several previously registered anchor states. Technically, we propose a two-stage learning-based gaze estimation framework that divides the whole estimation process into a coarse-to-fine pipeline of anchor state selection followed by final gaze localization. Moreover, to improve generalization ability, we align a group of local experts with a student network, where a novel denoising distillation algorithm leverages the denoising diffusion technique to iteratively remove the inherent noise of event data. Extensive experiments demonstrate the effectiveness of the proposed method, which surpasses state-of-the-art methods by a large margin of 15\%. The code will be publicly available at https://github.com/jdjdli/Denoise_distill_EF_gazetracker.
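To make the coarse-to-fine formulation concrete, the sketch below illustrates one plausible reading of the two-stage inference flow (anchor state selection, then regression of the transition to the final gaze point). All module names, feature shapes, and the linear heads are illustrative assumptions for exposition, not the authors' released implementation; see the repository linked above for the actual code.

\begin{verbatim}
# Hypothetical sketch of the two-stage coarse-to-fine inference described in
# the abstract: select a registered anchor state, then regress the transition
# (gaze offset) from that anchor. Names and shapes are assumptions.
import torch
import torch.nn as nn

class AnchorSelector(nn.Module):
    """Stage 1 (coarse): score the registered anchor states for the current state."""
    def __init__(self, feat_dim: int, num_anchors: int):
        super().__init__()
        self.head = nn.Linear(feat_dim, num_anchors)

    def forward(self, state_feat: torch.Tensor) -> torch.Tensor:
        # Probability that each anchor is the closest registered state.
        return self.head(state_feat).softmax(dim=-1)

class TransitionRegressor(nn.Module):
    """Stage 2 (fine): regress the state transition from the chosen anchor."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.head = nn.Linear(2 * feat_dim, 2)  # (dx, dy) gaze offset

    def forward(self, state_feat, anchor_feat):
        return self.head(torch.cat([state_feat, anchor_feat], dim=-1))

def estimate_gaze(state_feat, anchor_feats, anchor_gazes, selector, regressor):
    """Coarse-to-fine estimate: pick the best anchor, then add the regressed offset.

    state_feat:   (B, D) fused event/frame feature of the current eye state
    anchor_feats: (K, D) features of the K registered anchor states
    anchor_gazes: (K, 2) gaze points registered for each anchor
    """
    probs = selector(state_feat)            # (B, K) anchor scores
    idx = probs.argmax(dim=-1)              # coarse: index of the selected anchor
    chosen_feat = anchor_feats[idx]         # (B, D) feature of that anchor
    chosen_gaze = anchor_gazes[idx]         # (B, 2) its registered gaze point
    offset = regressor(state_feat, chosen_feat)  # fine: transition from the anchor
    return chosen_gaze + offset             # (B, 2) final gaze estimate
\end{verbatim}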