
Using haptic cues to aid nonvisual structure recognition

Journal contribution posted on 2013-10-02, 13:27, authored by Caroline Jay, Robert Stevens, Roger Hubbold and Mashhuda Glencross
Retrieving information that is presented visually is difficult for visually disabled users. Current accessibility technologies, such as screen readers, fail to convey presentational layout or structure, and information presented in graphs or images is almost impossible to convey through speech alone. In this paper, we present the results of an experimental study investigating the role of touch (haptic) and auditory cues in aiding structure recognition when visual presentation is unavailable. We hypothesize that guiding users toward the nodes of a graph structure with force fields will make its overall structure easier to recognize. Nine participants explored simple 3D structures containing nodes (spheres or cubes) laid out in various spatial configurations, then identified the nodes and drew their overall structure. Various combinations of haptic and auditory feedback were explored. Our results demonstrate that haptic cues significantly helped participants to recognize nodes and structure quickly. Surprisingly, auditory cues alone did not speed up node recognition; however, when they were combined with haptic cues, both node identification and structure recognition improved significantly. This result demonstrates that haptic feedback plays an important role in enabling people to recall spatial layout.
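
The abstract does not describe how the node force fields were rendered; as an illustrative sketch only, one common way to "pull" a haptic probe toward the nearest node is a spring-like attraction that activates inside a fixed field radius. The function name, field radius and stiffness below are assumptions for the sake of the example, not details taken from the paper.

    import numpy as np

    def force_field_pull(probe_pos, node_centres, field_radius=0.05, stiffness=150.0):
        """Spring-like force pulling the haptic probe toward the nearest node
        once the probe enters that node's force field.

        All parameter values here are illustrative, not taken from the paper.
        """
        nodes = np.asarray(node_centres, dtype=float)
        probe = np.asarray(probe_pos, dtype=float)
        offsets = nodes - probe                    # probe -> node vectors
        distances = np.linalg.norm(offsets, axis=1)
        nearest = int(np.argmin(distances))        # index of the closest node
        if distances[nearest] > field_radius:      # outside every field: no force
            return np.zeros(3)
        # Hooke's-law attraction toward the node centre (F = k * displacement)
        return stiffness * offsets[nearest]

The returned force vector would be sent to the haptic device each servo cycle; outside every field the probe moves freely, so the user only feels a pull when they come close to a node.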

History

School

  • Science

Department

  • Computer Science

Citation

JAY, C., STEVENS, R., HUBBOLD, R. and GLENCROSS, M., 2008. Using haptic cues to aid nonvisual structure recognition. ACM Transactions on Applied Perception, 5 (2), 14pp.

Publisher

© ACM

Version

  • AM (Accepted Manuscript)

Publication date

2008

Notes

This article was published in the journal ACM Transactions on Applied Perception [© ACM]. The definitive version is available at: http://dx.doi.org/10.1145/1279920.1279922

ISSN

1544-3558

Language

  • en