posted on 2007-09-27, 11:21, authored by Fangmin Shi, Alastair G. Gale, Kevin Purdy
This paper describes a new control system interface which uses the
user’s eye gaze to enable severely disabled individuals to control electronic
devices easily. The system is based upon a novel human computer interface,
which facilitates simple control of electronic devices by predicting and
responding to the user’s likely intentions, inferred intuitively from their point
of gaze. The interface responds by automatically pre-selecting and offering only
those controls appropriate to the specific device that the user looks at, in a
simple and accessible manner. It therefore gives the user a conscious choice of
the appropriate range of control actions, which can be executed by
simple means without the need to navigate manually through potentially
complex control menus to reach them. Two systems, using a head-mounted
and a remote eye tracker respectively, are introduced, compared and evaluated
in this paper.
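To illustrate the pre-selection idea described in the abstract, the following minimal Python sketch maps a gaze point to a device region and offers only that device's controls. The device names, regions, and control sets here are hypothetical illustrations, not the authors' implementation.

```python
from dataclasses import dataclass

# Hypothetical illustration of gaze-contingent control pre-selection:
# each device occupies a region of the tracked scene and carries its own
# small set of controls; only the controls of the device being looked at
# are offered to the user.

@dataclass
class Device:
    name: str
    region: tuple       # (x_min, y_min, x_max, y_max) in scene coordinates
    controls: list      # controls offered when the user looks at this device

DEVICES = [
    Device("lamp", (0, 0, 200, 200), ["on", "off", "dim"]),
    Device("television", (300, 100, 800, 500),
           ["on", "off", "channel up", "channel down", "volume"]),
]

def controls_for_gaze(gaze_x: float, gaze_y: float):
    """Return the device under the gaze point and only its relevant controls."""
    for device in DEVICES:
        x0, y0, x1, y1 = device.region
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return device.name, device.controls
    return None, []  # gaze not on any known device; offer nothing

# Example: a gaze point falling inside the television's region
print(controls_for_gaze(450, 300))
```

In this sketch the user never browses a full control menu; the gaze position alone narrows the offered actions to those of a single device, which matches the paper's aim of reducing manual menu navigation for severely disabled users.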
History
School
Science
Department
Computer Science
Citation
SHI, F., GALE, A.G. and PURDY, K. (2007). A new gaze-based interface for environmental control. IN: Universal Access in Human-Computer Interaction. Ambient Interaction. 4th International Conference on Universal Access in Human-Computer Interaction, held as part of HCI International, Beijing, China, July 22-27, Proceedings, Part II. Lecture Notes in Computer Science, 4555. Berlin: Springer-Verlag, pp. 996-1005.