Posted on 2021-02-26, 15:26. Authored by Achim Buerkle, Will Eaton, Niels Lohse, Tom Bamber, Pedro Ferreira
Consumer markets demonstrate an observable trend towards mass customization. Assembly processes are required to adapt in order to meet the requirements of increased product complexity and constant variant updates. One concept for meeting the challenges of this trend is close collaboration between human workers and robots. Currently, in order to protect human operators, barriers and restrictions are in place that prevent close collaboration. This is because safety systems are mostly reactive, rather than anticipating motions or intentions. Probabilistic models exist that aim to overcome these limitations, yet predicting human behavior remains highly complex. It would therefore be desirable to physically measure movement intentions in advance. A novel approach is presented for measuring upper-limb movement intentions with a mobile electroencephalogram (EEG). The human brain constantly analyses and evaluates motor movements up to 0.5 s before their execution. A safety system could therefore be given an early warning of an upcoming movement. To classify the EEG signals as quickly as possible and to minimize fine-tuning effort, a novel data processing methodology is introduced. It includes TimeSeriesKMeans labelling of movement intentions, which is then used to train a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN). The results suggest that high detection accuracies and potential time gains of up to 513 ms can be achieved in a semi-online system. When included in a simulation, these time advantages demonstrated the potential to improve a system's reaction time and therefore the safety and fluency of Human-Robot Collaboration.
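A minimal sketch of the two-stage pipeline described in the abstract is given below, assuming tslearn's TimeSeriesKMeans for the unsupervised labelling step and a Keras LSTM as the classifier. The window dimensions, channel count and all hyper-parameters are hypothetical placeholders, not the authors' published configuration.

import numpy as np
from tslearn.clustering import TimeSeriesKMeans
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

# Hypothetical EEG windows: 100 trials x 64 samples x 8 channels
# (the real channel count, window length and sampling rate are assumptions).
X = np.random.randn(100, 64, 8).astype(np.float32)

# Stage 1: unsupervised labelling of the windows (e.g. "idle" vs. "movement
# intention") with time-series k-means, standing in for the paper's
# TimeSeriesKMeans labelling step.
labeller = TimeSeriesKMeans(n_clusters=2, metric="dtw", random_state=0)
y = labeller.fit_predict(X)

# Stage 2: train an LSTM classifier on those labels so that new EEG windows
# can be scored quickly at run time.
model = Sequential([
    Input(shape=(X.shape[1], X.shape[2])),
    LSTM(32),
    Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=16, verbose=0)

# At run time, each incoming EEG window would be scored; a probability above
# a chosen threshold could act as the early warning to the robot's safety system.
print(model.predict(X[:1], verbose=0))

In this sketch, the k-means stage removes the need for hand-labelled training data, while the LSTM provides the fast per-window inference that an early-warning safety system would require.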
School
Mechanical, Electrical and Manufacturing Engineering
This is an Open Access Article. It is published by Elsevier under the Creative Commons Attribution 4.0 International Licence (CC BY). Full details of this licence are available at: http://creativecommons.org/licenses/by/4.0/