UC2018 DualMyo Hand Gesture Dataset

From MaRDI portal
Dataset:6695414



DOI: 10.5281/zenodo.1320922
Zenodo: 1320922
MaRDI QID: Q6695414
FDO: Q6695414

Dataset published in the Zenodo repository.

Miguel Simão, O. Gibaru, Pedro Neto

Publication date: 25 July 2018

Copyright license: Creative Commons Attribution-ShareAlike 4.0 International



This is a set of data obtained from two consumer-market EMG sensors (Myo armbands) while a subject performs 8 distinct hand gestures. There are a total of 110 repetitions of each gesture class, obtained across 5 recording sessions. Besides the data set, which is saved in a Python pickle file, we include a Python test script to load the data, generate random synthetic sequences of gestures, and classify them with multiple models (a minimal loading sketch is given at the end of this description).

Gesture library:
- Rest
- Closed fist
- Open hand
- Wave in
- Wave out
- Double-tap
- Hand down
- Hand up

Device placement: The two Myos are placed on the forearm with the USB port pointing outwards and the palm and sensor 5 facing upwards. The Myos sit next to one another, with their middle positioned close to the thickest section of the forearm. The outer Myo is rotated slightly so that sensor 5 is aligned with the axis of the palmaris longus tendon. The inner Myo is rotated so that it forms an angle of 22.5 degrees with the first Myo, in the clockwise direction (subject perspective).

Acquisition protocol: The subjects wear the armbands according to the instructions above. The sensors are run for a few minutes to warm up. The subjects are asked to hold each gesture position for a few seconds while 2 seconds of data are recorded. The gestures are repeated in random order across several sessions.
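The description above states that the data are stored in a Python pickle file and that eight gesture classes were recorded from two 8-channel Myo armbands. The following is a minimal sketch of how such a file could be loaded and inspected; the file name and the internal (samples, targets) layout are assumptions, not the documented format, so the bundled test script from the Zenodo record should be treated as the authoritative reference.

    # Minimal loading sketch. File name and internal structure are assumptions;
    # see the test script shipped with the Zenodo record for the actual layout.
    import pickle

    with open("dualmyo_dataset.pkl", "rb") as f:  # hypothetical file name
        data = pickle.load(f)

    # Assumed layout: a (samples, targets) pair, where each sample is one
    # 2-second recording from the two 8-channel Myo armbands (16 EMG channels)
    # and each target is an integer in 0..7 for the eight gesture classes.
    samples, targets = data

    print("Number of recordings:", len(samples))
    print("Shape of first recording:", samples[0].shape)  # e.g. (timesteps, 16)
    print("Gesture label of first recording:", targets[0])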

This page was built for dataset: UC2018 DualMyo Hand Gesture Dataset