
Human-Human co-manipulation data

Dataset posted on 2020-11-27, 14:43, authored by Ali Al-Yacoub and Myles Flanagan
The shared data were filtered and manually cleaned. There are four co-manipulation trials, each of which contains the following columns (a short loading sketch follows the list):
  1. Timestamp: 'rosbagTimestamp'
  2. Number of samples: '# of samples'
  3. Force/torque signal: 'Fx', 'Fy', 'Fz', 'Tx', 'Ty', 'Tz'
  4. Object position: 'x.v', 'y.v', 'z.v', 'Rx.v', 'Ry.v', 'Rz.v', 'w.v'
  5. Follower right arm/forearm EMG: 'emg_RFsgl', 'emg_RAsgl'
  6. Object displacement (Cartesian): 'disp_x', 'disp_y', 'disp_z'
  7. Time difference between two consecutive timestamps: 'step_size'
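
As a minimal sketch of how one trial might be loaded and sanity-checked with pandas (the filename 'trial_1.csv' is a hypothetical placeholder for one of the four shared files):

    import pandas as pd

    # Load one trial; "trial_1.csv" stands in for one of the four shared files.
    df = pd.read_csv("trial_1.csv")

    # Magnitude of the interaction force from the 6-axis force/torque signal.
    df["F_mag"] = (df["Fx"] ** 2 + df["Fy"] ** 2 + df["Fz"] ** 2) ** 0.5

    # Magnitude of the Cartesian object displacement.
    df["disp_mag"] = (df["disp_x"] ** 2 + df["disp_y"] ** 2 + df["disp_z"] ** 2) ** 0.5

    # 'step_size' should agree with the successive differences of
    # 'rosbagTimestamp' (up to the timestamp's unit).
    dt = df["rosbagTimestamp"].diff()
    print(df[["F_mag", "disp_mag", "step_size"]].describe())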
The data were collected as follows: two humans were asked to co-manipulate a 10 kg load together. One human acted as the leader, and the other was asked to follow. The only communication allowed between the leader and the follower was through haptic cues (measured using the 6-axis force/torque signal). The follower wore EMG muscle-activity sensors on the right arm and forearm. Finally, the object position was tracked using an eight-camera VICON system. All sensory data were synchronised using the ROS ApproximateTime synchroniser; hence, they share the same timestamps. The datasets were collected using a tool available on GitHub at https://github.com/Intelligent-Automation-Centre/bluebox, which is explained in more detail in Al-Yacoub et al. (2020).
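
For readers unfamiliar with ApproximateTime synchronisation, the following is a minimal sketch of how two of the sensor streams could be aligned with the ROS message_filters package; the topic names are illustrative assumptions, not the actual configuration of the bluebox tool:

    import rospy
    import message_filters
    from geometry_msgs.msg import WrenchStamped, PoseStamped

    def synced_callback(wrench, pose):
        # Messages delivered here carry approximately the same timestamp,
        # which is what yields a single shared 'rosbagTimestamp' per row.
        rospy.loginfo("t=%s Fz=%.2f", wrench.header.stamp, wrench.wrench.force.z)

    rospy.init_node("comanipulation_sync")

    # Topic names are placeholders for the force/torque and VICON streams.
    ft_sub = message_filters.Subscriber("/ft_sensor/wrench", WrenchStamped)
    vicon_sub = message_filters.Subscriber("/vicon/object/pose", PoseStamped)

    # ApproximateTime pairs messages whose stamps differ by at most `slop` seconds.
    sync = message_filters.ApproximateTimeSynchronizer(
        [ft_sub, vicon_sub], queue_size=10, slop=0.01)
    sync.registerCallback(synced_callback)

    rospy.spin()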

Reference:
Al-Yacoub, A., Buerkle, A., Flanagan, M., Ferreira, P., Hubbard, E. and Lohse, N. (2020) Effective Human-Robot Collaboration Through Wearable Sensors. In: 25th IEEE Conference on Emerging Technologies and Factory Automation, Vienna, Austria, 8-11 September (accepted for presentation).

Funding

Digital Toolkit for optimisation of operators and technology in manufacturing partnerships (DigiTOP)

Engineering and Physical Sciences Research Council



School

  • Mechanical, Electrical and Manufacturing Engineering