Abstract: This paper introduces and evaluates a hybrid technique that efficiently fuses the eye-tracking principles of photosensor oculography (PSOG) and video oculography (VOG). The core idea of this novel approach is to use a small number of fast, power-efficient photosensors as the primary mechanism for high-speed eye-tracking, while in parallel a video sensor operating at a low sampling rate (snapshot mode) performs dead-reckoning error correction when sensor movements occur. To evaluate the proposed method, we simulate the functional components of the technique and present results for experimental scenarios involving various combinations of horizontal and vertical eye and sensor movements. Our evaluation shows that the developed technique provides robustness to sensor shifts that could otherwise induce errors larger than 5 deg. Our analysis suggests that the technique can enable high-speed eye-tracking at low power profiles, making it suitable for emerging head-mounted devices, e.g. AR/VR headsets.
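As a rough illustration of the fusion scheme summarized above, the sketch below runs a fast PSOG loop and applies an offset correction whenever a slow VOG snapshot arrives. The sampling rates, the linear PSOG-to-gaze calibration, and all function names here are illustrative assumptions for a minimal sketch, not the paper's actual implementation.

```python
# Minimal sketch of a hybrid PSOG/VOG fusion loop (illustrative only).
import numpy as np

PSOG_RATE_HZ = 1000   # assumed fast photosensor sampling rate
VOG_RATE_HZ = 10      # assumed slow video "snapshot" rate

def psog_to_gaze(psog_sample, calibration):
    """Map raw photosensor readings to gaze angles (deg) via a
    hypothetical linear calibration: gaze = W @ s + b."""
    W, b = calibration
    return W @ psog_sample + b

def fuse(psog_stream, vog_snapshots, calibration):
    """Track gaze at the PSOG rate; whenever a VOG snapshot arrives,
    re-estimate the offset induced by sensor shifts and subtract it."""
    offset = np.zeros(2)                    # accumulated shift correction (deg)
    ratio = PSOG_RATE_HZ // VOG_RATE_HZ     # PSOG samples per VOG snapshot
    gaze_out = []
    for i, s in enumerate(psog_stream):
        gaze = psog_to_gaze(s, calibration) - offset
        if i % ratio == 0 and i // ratio < len(vog_snapshots):
            # Dead-reckoning correction: trust the slow but
            # shift-robust VOG estimate and update the offset.
            offset += gaze - vog_snapshots[i // ratio]
            gaze = psog_to_gaze(s, calibration) - offset
        gaze_out.append(gaze)
    return np.array(gaze_out)

# Tiny demo: a sensor shift makes the raw PSOG gaze read 6 deg while the
# true (VOG-reported) gaze is 5 deg; the fusion loop removes the bias.
W = np.array([[0.5, -0.5, 0.0, 0.0],
              [0.0,  0.0, 0.5, -0.5]])
b = np.zeros(2)
psog = np.tile([12.0, 0.0, 5.0, 5.0], (300, 1))   # shifted raw readings
vog = np.tile([5.0, 0.0], (3, 1))                 # shift-robust VOG gaze
print(fuse(psog, vog, (W, b))[-1])                # -> approx [5. 0.]
```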
Abstract: This paper presents an updated overview of photosensor oculography (PSOG), an eye-tracking technique based on the principle of using simple photosensors to measure the amount of reflected (usually infrared) light as the eye rotates. Photosensor oculography can provide measurements with high precision, low latency, and reduced power consumption, and thus appears to be an attractive option for performing eye-tracking in emerging head-mounted interaction devices, e.g. augmented and virtual reality (AR/VR) headsets. In our current work we employ an adjustable simulation framework as a common basis for an exploratory study of the eye-tracking behavior of different photosensor oculography designs. Through these experiments we explore how varying the basic parameters of each design affects the resulting accuracy and cross-talk, two characteristics that are crucial for the seamless operation of eye-tracking-based human-computer interaction applications. Our experimental results reveal the design trade-offs needed to balance the competing conditions that lead to optimum performance across different eye-tracking characteristics. We also present the transformations that arise in the eye-tracking output when sensor shifts occur, and assess the resulting degradation in accuracy for different combinations of eye movements and sensor shifts.
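To make the accuracy and cross-talk metrics mentioned above concrete, the sketch below evaluates a toy PSOG design in simulation. The linear sensor model, the 4-sensor weight matrix, and the noise level are illustrative assumptions chosen for a minimal sketch; they do not correspond to the designs studied in the paper.

```python
# Minimal sketch of evaluating accuracy and cross-talk for a toy PSOG design.
import numpy as np

def psog_response(gaze_deg, sensitivity, noise_std=0.0):
    """Hypothetical linear model: each photosensor's output is a weighted
    sum of the horizontal/vertical gaze angles plus measurement noise."""
    rng = np.random.default_rng(0)
    return sensitivity @ gaze_deg + rng.normal(0.0, noise_std, len(sensitivity))

# A 4-sensor design: rows = sensors, columns = (horizontal, vertical) weights.
# The small off-axis weights model imperfect sensor placement.
S = np.array([[ 1.0,  0.1],
              [-1.0,  0.1],
              [ 0.1,  1.0],
              [ 0.1, -1.0]])

# Recover gaze via least squares, then measure accuracy and cross-talk
# for a purely horizontal 10 deg rotation.
true_gaze = np.array([10.0, 0.0])
readings = psog_response(true_gaze, S, noise_std=0.05)
est_gaze, *_ = np.linalg.lstsq(S, readings, rcond=None)
accuracy_err = est_gaze[0] - true_gaze[0]   # horizontal error (deg)
cross_talk = est_gaze[1] - true_gaze[1]     # spurious vertical output (deg)
print(f"accuracy error: {accuracy_err:.3f} deg, cross-talk: {cross_talk:.3f} deg")
```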