ePrints@IISc

Deep Learning-based Eye Gaze Estimation for Military Aviation

Murthy, LRD and Biswas, P (2022) Deep Learning-based Eye Gaze Estimation for Military Aviation. In: 2022 IEEE Aerospace Conference, AERO 2022, 5-12 March 2022, Big Sky.

PDF: IEEE_Aero_2022_2022.pdf - Published Version
Restricted to Registered users only
Official URL: https://doi.org/10.1109/AERO53065.2022.9843506

Abstract

Eye gaze estimation and cognitive load estimation of the pilot have garnered great attention in the aviation domain due to numerous possible applications. Earlier works proposed using eye gaze tracking to interact with multi-function displays (MFDs) and head-up displays (HUDs) in place of traditional interaction devices. Further, researchers investigated the accuracy of commercially available gaze trackers in-flight by conducting studies under actual flying scenarios such as varying g-conditions and different maneuvers. In this paper, we first studied the functioning of a wearable eye gaze tracker using two one-hour-long flights, during which pilots undertook various challenging maneuvers. We analyzed the recorded gaze tracking data and observed that ∼42% and ∼31% of the flight duration resulted in loss of gaze data in flight 1 and flight 2, respectively. Further, we analyzed the unsynchronized raw data and observed that both flights recorded error-prone gaze samples for ∼51% of the flight duration. We hypothesized, and verified, that this loss of data is caused by higher levels of illumination on the eyes and by the limited vertical field of view provided by the gaze tracker. The data from both flights supported this hypothesis, and it was evident that the field of view offered by current eye tracking glasses is not sufficient for military aviation. We addressed the first limitation using a machine learning approach: we built an end-to-end gaze estimation system that takes IR eye images recorded by the wearable eye tracking glasses and predicts the gaze point. We sampled 10K images with accurate ground-truth gaze points; the dataset contained wide variation in illumination and pupil dilation. We observed that the proposed approach, based on a convolutional neural network, resulted in low gaze estimation errors and consistent gaze predictions.
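The end-to-end approach described in the abstract (a CNN regressing a 2D gaze point directly from IR eye images) can be sketched as below. This is an illustrative sketch only, not the authors' architecture: the input resolution (36×60 grayscale), the layer sizes, and the `GazeNet` name are all assumptions, chosen to show the image-in, gaze-point-out structure.

```python
# Hypothetical sketch of an end-to-end gaze regression CNN.
# Input: single-channel IR eye image (assumed 36x60); output: (x, y) gaze point.
# Architecture details are illustrative, not taken from the paper.
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x36x60 -> 16x36x60
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 16x18x30
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # -> 32x18x30
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x9x15
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # -> 64x9x15
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 64x4x7
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 4 * 7, 128),
            nn.ReLU(),
            nn.Linear(128, 2),  # predicted (x, y) gaze point
        )

    def forward(self, x):
        return self.regressor(self.features(x))

if __name__ == "__main__":
    model = GazeNet()
    ir_eyes = torch.randn(4, 1, 36, 60)  # batch of 4 synthetic IR eye crops
    gaze = model(ir_eyes)
    print(gaze.shape)  # torch.Size([4, 2])
```

In practice such a network would be trained with an L2 loss against the ground-truth gaze points; robustness to the illumination and pupil-dilation variation mentioned in the abstract would come from the diversity of the 10K-image training set rather than from the architecture itself.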

Item Type: Conference Paper
Publication: IEEE Aerospace Conference Proceedings
Publisher: IEEE Computer Society
Additional Information: The copyright for this article belongs to IEEE Computer Society.
Keywords: Deep learning; Glass; Wearable technology; Aviation domain; Cognitive loads; Eye-gaze; Eye-tracking; Field of view; Flight duration; Gaze estimation; Gaze point; Gaze tracker; Load estimation; Eye tracking
Department/Centre: Division of Mechanical Sciences > Centre for Product Design & Manufacturing
Date Deposited: 06 Oct 2022 08:21
Last Modified: 06 Oct 2022 08:21
URI: https://eprints.iisc.ac.in/id/eprint/77146
