Trust and fluency in industrial human-robot interaction - virtual reality study of human behaviour
Thesis posted on 24.11.2021, 13:17 by Piotr Fratczak
As industrial automation evolves, combining the strengths of humans and robots in Human-Robot Collaboration is becoming more common. Although removing the physical barriers separating humans and industrial robots carries many risks, robots are becoming inherently, mechanically safe and may soon be incapable of physically harming humans. However, just because robots cannot physically harm humans does not mean they cannot influence human wellbeing and performance. It is therefore imperative to understand how human trust and collaborative fluency are affected by a robot’s changing behaviour. The main focus of this doctoral thesis is the influence of industrial robots on human behaviour. To elicit strong and natural human responses in a safe and controllable laboratory environment, this thesis uses an immersive Virtual Reality (VR) Head-Mounted Display (HMD) to simulate the dangers of industrial robots without exposing anyone to actual harm.
As its first knowledge contribution, this doctoral thesis used a VR HMD to study how an industrial robot’s predictability and changes of behaviour influence human responses and trust in a Human-Robot Interaction (HRI) scenario. The results show that a robot’s unexpected sudden movements significantly affect a human’s posture, focus and trust. Furthermore, it is shown that people can naturally recover after an accident; however, the robot’s post-accident actions can either jeopardise that recovery (through further trust-violating actions, such as suddenly changing movement trajectories) or significantly speed it up (through trust-recovering actions, such as apologising).
The second contribution of this thesis focuses on collaborative fluency and human adaptability to an industrial robot’s behaviour. A VR HMD was used to simulate a collaborative scenario in which a robot and a human work on the same part at the same time. The results reveal large differences in people’s adaptive capabilities. Some participants easily adapt to the robot’s increasing speed and work around it without losing any collaborative fluency. Others fail to keep up with the robot, which significantly lowers their collaborative fluency. When the robot sped up, some participants abandoned collaborative fluency altogether, waiting for the robot to finish its task and stop moving before showing any intention to work. Furthermore, it is shown that fluent and non-fluent participants display statistically different behaviours and physiology, and that they can often be identified even before their fluency decreases. This suggests that it may be possible to predict eventual drops in collaborative performance and prepare for them in an individualised way.
The final contribution studies the relationship between people’s physiology and their motion. The results show that classification models trained on physiological features from two different groups of participants, engaged in two different experiments, generate similar outputs. Even though some motion features are less likely to be reflected in physiological features, the results suggest that it should be possible to create a task-independent method of clustering participants in order to predict certain behaviours.
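The cross-experiment idea above can be sketched in miniature: fit a simple classifier on physiological features from participants in one experiment, then apply it to unseen participants from a second experiment. This is a hypothetical illustration only; the feature names (heart rate, skin conductance), the labels, the data, and the nearest-centroid method are assumptions for the sketch, not the thesis’s actual models or dataset.

```python
# Illustrative sketch (not the thesis's method): a nearest-centroid classifier
# trained on synthetic physiological features from "experiment A" participants,
# then applied to participants from a different "experiment B" task.
import math

def centroid(rows):
    # Component-wise mean of a list of feature vectors.
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(x, centroids):
    # Assign the label of the nearest centroid (Euclidean distance).
    return min(centroids, key=lambda label: math.dist(x, centroids[label]))

# Experiment A: (mean heart rate, skin conductance) per participant, labelled
# by observed collaborative fluency. Values are invented for illustration.
exp_a = {
    "fluent":     [[72, 0.31], [75, 0.28], [70, 0.35]],
    "non_fluent": [[95, 0.62], [90, 0.58], [98, 0.65]],
}
centroids = {label: centroid(rows) for label, rows in exp_a.items()}

# Experiment B: unseen participants from a different task.
exp_b = [[74, 0.30], [93, 0.60]]
labels = [classify(x, centroids) for x in exp_b]
print(labels)  # ['fluent', 'non_fluent']
```

If physiological features behave similarly across tasks, as the thesis suggests, a model of this shape fitted on one experiment would transfer to another; task-dependent motion features would need handling separately.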
Although all data collection was conducted in virtual reality, and it is uncertain whether the same results would be obtained in a real industrial environment, participants’ self-reports suggest that up to 80% of them felt immersed and were committed to the goals of the experiments. This doctoral thesis shows not only that human mental and emotional wellbeing must be considered when designing HRI scenarios, but also that virtual reality is a valuable tool that can significantly improve the way industrial environments are studied and developed.
- Mechanical, Electrical and Manufacturing Engineering