Sociotechnical systems are becoming more complex and increasingly automated. Although human error is widely viewed as playing a key role in the majority of system failures, there is growing recognition that this view is an oversimplification. This paper examines mismatches between the procedures and automation technologies of sociotechnical systems on the one hand, and the culture and capabilities of their human operators on the other, with a particular focus on flight deck automation. Following an introduction to culture, its sources, its measurement and its effects, the paper describes recent theories of thinking and decision-making, and the influence of culture on decisions. Problems associated with automation are then presented, and it is concluded that current automation systems perform as very inadequate team members, leaving human operators or crews unprepared when failures occur or unusual events arise.
Funding
This work was supported by the Engineering and Physical Sciences Research Council (EPSRC) [grant number EP/P504236/1; EP/P503396/1].
School
Mechanical, Electrical and Manufacturing Engineering
Citation
HODGSON, A., SIEMIENIUCH, C.E. and HUBBARD, E-M., 2013. Culture and the safety of complex automated sociotechnical systems. IEEE Transactions on Human-Machine Systems, 43 (6), 12 pp.
This is an Open Access Article. It is published by the IEEE under the Creative Commons Attribution 3.0 Unported Licence (CC BY). Full details of this licence are available at: http://creativecommons.org/licenses/by/3.0/