Ranking crossing scenario complexity for eHMIs testing: A virtual reality study
External human–machine interfaces (eHMIs) have the potential to benefit AV–pedestrian interactions. Most studies investigating eHMIs have used relatively simple traffic environments, i.e., a single pedestrian crossing in front of a single eHMI on a one-lane straight road. While this approach has proved efficient in providing an initial understanding of how pedestrians respond to eHMIs, it over-simplifies interactions that will be substantially more complex in real-life circumstances. This paper illustrates, through a small-scale study (N = 10), a process for ranking different crossing scenarios by level of complexity. Traffic scenarios were first developed by varying traffic density, visual complexity of the road scene, road geometry, weather and visibility conditions, and the presence of distractions, factors previously shown to increase the difficulty and riskiness of the crossing task. The scenarios were then tested in a motion-based virtual reality environment. Pedestrians’ perceived workload and objective crossing behaviour were measured as indirect indicators of the complexity of each crossing scenario. Sense of presence and simulator sickness were also recorded as measures of the ecological validity of the virtual environment. The results indicated that some crossing scenarios were more taxing for pedestrians than others, such as those with road geometries where traffic approached from multiple directions. Further, the presence scores showed that participants found the virtual environments realistic. The paper concludes by proposing a “complex” environment for testing eHMIs under more challenging crossing circumstances.
School
- Design and Creative Arts
Department
- Design
Published in
- Multimodal Technologies and Interaction
Volume
- 7
Issue
- 2
Publisher
- MDPI
Version
- VoR (Version of Record)
Rights holder
- © The authors
Publisher statement
- This article is an Open Access article published by MDPI and distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Acceptance date
- 2023-01-28
Publication date
- 2023-02-02
Copyright date
- 2023
eISSN
- 2414-4088
Publisher version
Language
- en