Hand and arm gesture detection provides a natural way of human-computer interaction. The ability to efficiently detect gestures can be applied not only to consumer electronics control (e.g. controlling a smartphone, tablet or smart TV), but also to the detection of functional motor activities for physical and cognitive rehabilitation and fall prevention.
Surface electromyography (EMG) is a technique for evaluating and recording the electrical activity produced by skeletal muscles. EMG signals can therefore be used to detect arm movements and recognize hand gestures. However, due to limitations inherent in surface EMG measurement, such as crosstalk between adjacent muscles and sensitivity to electrode placement, the number of gestures that can be discriminated is still limited.
An inertial measurement unit (IMU), comprising a 3-axis accelerometer, a 3-axis gyroscope and, optionally, a 3-axis magnetometer, enables the measurement of motion and rotation; using sensor fusion techniques, it can also estimate the orientation of the arm segment it is attached to with increased accuracy. IMU signals can therefore be used to recognize gestures and evaluate the quality of movements.
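To illustrate the kind of sensor fusion mentioned above, the sketch below shows a minimal complementary filter that blends the gyroscope's integrated angular rate (accurate short-term, but drifting) with the accelerometer's gravity-based angle (noisy, but drift-free). This is one common fusion technique, not necessarily the one used in this project; the axis convention and the `alpha` weight are illustrative assumptions.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_y, accel_z, dt, alpha=0.98):
    """Fuse one gyroscope and accelerometer sample into a pitch estimate (radians).

    pitch_prev : previous pitch estimate (rad)
    gyro_rate  : angular rate about the pitch axis (rad/s)
    accel_y/z  : accelerometer components used to sense gravity direction
    alpha      : weight on the gyroscope term (illustrative default)
    """
    pitch_gyro = pitch_prev + gyro_rate * dt       # integrate angular rate (drifts over time)
    pitch_accel = math.atan2(accel_y, accel_z)     # gravity-referenced angle (noisy, no drift)
    # High-pass the gyro estimate, low-pass the accelerometer estimate
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```

In practice the filter runs at the IMU sampling rate, so the drift of the gyroscope integration is continuously corrected by the accelerometer reference.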
Considering the complementary features of EMG and IMU, fusing their data can increase both the number of hand, wrist and forearm gestures that can be discriminated and the accuracy of the discrimination.
Multi-channel surface EMG and IMU sensors attached to the arm were used to detect hand, wrist and forearm gestures. A data fusion algorithm exploiting the complementary features of the EMG and IMU signals was developed so that a larger number of gestures could be detected. Gesture recognition finds application in many areas, including gesture-based interaction for people with disabilities and rehabilitation.
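One common way to fuse the two modalities, sketched below, is feature-level fusion: compute amplitude features from each EMG channel and summary features from each IMU axis over the same time window, then concatenate them into a single vector for a downstream classifier. This is a generic illustration under assumed feature choices (RMS for EMG, window mean for IMU), not the project's actual algorithm.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one EMG channel window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def fused_features(emg_windows, imu_windows):
    """Feature-level fusion of synchronized EMG and IMU windows.

    emg_windows : list of per-channel EMG sample windows
    imu_windows : list of per-axis IMU sample windows
    Returns one concatenated feature vector that any standard
    classifier (e.g. SVM, LDA) could consume for gesture recognition.
    """
    emg_feats = [rms(w) for w in emg_windows]                 # muscle activation level
    imu_feats = [sum(w) / len(w) for w in imu_windows]        # mean motion per axis
    return emg_feats + imu_feats
```

Because the EMG features capture muscle activation while the IMU features capture limb motion and orientation, the concatenated vector lets the classifier separate gestures that look identical in either modality alone.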
For any additional information regarding this project, please contact us using the inquiries form.