Digital and video analysis of eye-glance movements during naturalistic driving from the ADSEAT and TeleFOT field operational trials - results and challenges
Conference contribution posted on 2016-06-14, 08:21, authored by Karthikeyan Ekambaram, James Lenard and Steven Reed
The EU projects ADSEAT (2009-2013) and TeleFOT (2008-2012) both included components of work involving naturalistic driving trials in instrumented vehicles. Of specific interest to this paper was the use of video recordings and digital eye-tracker readings to monitor eye-gaze behaviour. The aim of the study was to describe the results and challenges of applying these two methodologies under real-life driving conditions, based on nine subjects from the ADSEAT project and ten from the TeleFOT project. It proved possible to detect the effect of navigation devices on driver attention, as reflected in eye-glance behaviour, through manual review of video recordings; this procedure was, however, very labour intensive. While the digital eye-tracker produced reliable measurements of head movements through real-time image processing and recognition of facial features, it generally failed to provide meaningful data on eye-gaze movements. There were, however, several minutes of remarkably accurate eye-gaze readings within the hours of recording, which demonstrated that the technology could work if the experimental methodology were perfected. This potentially opens the way to cost-effective analysis of eye-gaze behaviour through the application of computerised algorithms to digital files.
Published in: European Conference on Human Centred Design for Intelligent Transport Systems
Citation: EKAMBARAM, K., LENARD, J. and REED, S., 2016. Digital and video analysis of eye-glance movements during naturalistic driving from the ADSEAT and TeleFOT field operational trials - results and challenges. Presented at the European Conference on Human Centred Design for Intelligent Transport Systems, Loughborough, UK, 30th Jun-1st Jul.
Version: AM (Accepted Manuscript)
Publisher statement: This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/
Notes: This is a conference paper.