This work explored the requirements for accurately and reliably predicting user intention with a deep learning methodology during fine-grained movements of the human hand. The focus was on combining a feature engineering process with the capacity of deep learning to identify further salient characteristics from a biological input signal. Three time-domain features (root mean square, waveform length, and
slope sign changes) were extracted from the surface
electromyography (sEMG) signal of 17 hand and wrist
movements performed by 40 subjects. The feature data were mapped to 6 bend-resistance sensor readings from a CyberGlove
II system, representing the associated hand kinematic data.
These sensors were located at specific joints of interest on the
human hand (the thumb’s metacarpophalangeal joint, the
proximal interphalangeal joint of each finger, and the
radiocarpal joint of the wrist). All datasets were taken from
database 2 of the NinaPro online database repository. A 3-layer long short-term memory (LSTM) model with dropout was developed to predict the 6 glove sensor readings using a corresponding sEMG feature vector as input. Initial trials on test data from the 40 subjects produced an average mean squared error of 0.176. This indicates a viable pathway for this method of predicting hand movement data, although further work is needed to optimize the model and to analyze the data with a more detailed set of metrics.
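As a concrete illustration of the pipeline described in the abstract, the sketch below computes the three time-domain features over a single analysis window per sEMG channel and stacks a 3-layer LSTM with dropout that maps sequences of those feature vectors to the 6 glove sensor targets. The channel count (12, matching the electrode setup of NinaPro database 2), window handling, sequence length, layer widths, dropout rate, and slope-sign-change threshold are illustrative assumptions, not values reported by the authors.

    import numpy as np
    import tensorflow as tf

    # --- Time-domain features over one analysis window of a single sEMG channel ---

    def rms(window: np.ndarray) -> float:
        """Root mean square of the window."""
        return float(np.sqrt(np.mean(window ** 2)))

    def waveform_length(window: np.ndarray) -> float:
        """Cumulative waveform length: sum of absolute successive differences."""
        return float(np.sum(np.abs(np.diff(window))))

    def slope_sign_changes(window: np.ndarray, threshold: float = 1e-5) -> int:
        """Count of points where the slope of the signal changes sign.

        The small threshold suppresses changes caused by low-level noise
        (assumed value; not specified in the abstract).
        """
        prev_diff = window[1:-1] - window[:-2]   # slope into each interior sample
        next_diff = window[1:-1] - window[2:]    # slope out of each interior sample
        return int(np.sum((prev_diff * next_diff) >= threshold))

    def feature_vector(window_all_channels: np.ndarray) -> np.ndarray:
        """Stack RMS, WL and SSC for every sEMG channel into one feature vector.

        window_all_channels has shape (window_samples, n_channels).
        """
        feats = []
        for ch in range(window_all_channels.shape[1]):
            w = window_all_channels[:, ch]
            feats.extend([rms(w), waveform_length(w), slope_sign_changes(w)])
        return np.asarray(feats, dtype=np.float32)

    # --- 3-layer LSTM with dropout mapping sEMG feature sequences to 6 glove sensors ---
    N_CHANNELS = 12                  # NinaPro DB2 electrode count
    N_FEATURES = 3 * N_CHANNELS      # RMS, WL, SSC per channel
    SEQ_LEN = 20                     # consecutive feature vectors per sample (assumed)

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(SEQ_LEN, N_FEATURES)),
        tf.keras.layers.LSTM(128, return_sequences=True),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.LSTM(128, return_sequences=True),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.Dense(6),    # one output per CyberGlove II sensor of interest
    ])
    model.compile(optimizer="adam", loss="mse")  # mean squared error, the metric quoted above

Training would then pair each sEMG feature sequence with the simultaneously recorded 6 CyberGlove sensor readings and minimise the mean squared error; the hyperparameters shown are placeholders rather than the authors' configuration.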
School
Sport, Exercise and Health Sciences
Published in
UK Robotics and Autonomous Systems, UK-RAS 2019.
Citation
ROBINSON, C.P. ... et al., 2019. A deep adaptive framework for robust myoelectric hand movement prediction. Presented at the 2nd UK Robotics and Autonomous Systems Conference (UK-RAS 2019), Loughborough University, 24th January.
Version
VoR (Version of Record)
Publisher statement
This work is made available according to the conditions of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) licence. Full details of this licence are available at: https://creativecommons.org/licenses/by-nc-nd/4.0/