Smartphone capabilities have been increasing in recent years, and many applications have been developed to take advantage of them. Smartphone users, knowing that there is a wealth of applications to ease daily tasks, or simply to have some fun, typically keep the smartphone close by. One of the capabilities that has been explored is the detection of the physical state of the device through its embedded inertial sensors. The smartphone can detect its own orientation, and even whether it is being moved, through sensors such as the accelerometer, the linear acceleration sensor, or the gyroscope.
Given that the smartphone can detect its own physical state, and that users often have the device close by, an opportunity arises to develop a new, natural, and intuitive way of interacting with it: gestures. Using the embedded accelerometer, linear acceleration sensor, or gyroscope, the smartphone can detect movements made by the hand that is holding it. The objective of this Dissertation is to develop a software framework for Android applications that makes them capable of detecting gestures performed by their users as a means of interaction, sparing Android developers the effort of implementing such functionality themselves. The framework includes a pre-trained set of gestures that developers can use out of the box, and it can also learn new gestures automatically through repeated training examples.
Gesture recognition is carried out with a Hidden Markov model approach in a user-independent setting, achieving an average recognition accuracy of 97.8% using the gyroscope and the linear acceleration sensor on an alphabet of 8 gestures, and an average accuracy of 85.1% using the accelerometer and the gyroscope on an alphabet of 24 gestures. Smartphone gesture recognition has been applied in several research areas, such as health care, monitoring systems, and user convenience. An Android application using this framework could, for instance, remotely control an electronic device or trigger an action on the smartphone.
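To illustrate the classification step described above, the sketch below shows how a discrete Hidden Markov model can score a gesture sequence with the standard forward algorithm and pick the best-matching gesture. This is a minimal illustration, not the thesis's actual implementation: the gesture names, the 2-state models, the 3-symbol alphabet (assumed to come from quantizing sensor readings), and all probabilities are hypothetical.

```python
import math

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi: initial state distribution, A: state transition matrix,
    B: emission matrix (B[state][symbol]). Per-step scaling avoids
    numerical underflow on long sequences.
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    log_lik = math.log(s)
    alpha = [a / s for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o]
                 for i in range(n)]
        s = sum(alpha)
        log_lik += math.log(s)
        alpha = [a / s for a in alpha]
    return log_lik

def classify(obs, models):
    """Return the gesture label whose HMM assigns the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

# Two toy 2-state gesture models over a 3-symbol quantized-motion alphabet.
# "flick" tends to emit symbol 2 (strong motion); "circle" alternates 0 and 1.
models = {
    "flick": ([0.9, 0.1],
              [[0.7, 0.3], [0.2, 0.8]],
              [[0.1, 0.2, 0.7], [0.05, 0.15, 0.8]]),
    "circle": ([0.5, 0.5],
               [[0.1, 0.9], [0.9, 0.1]],
               [[0.8, 0.15, 0.05], [0.1, 0.85, 0.05]]),
}

print(classify([2, 2, 2, 1, 2], models))  # strong-motion sequence
print(classify([0, 1, 0, 1, 0], models))  # alternating sequence
```

In a full recognizer, one HMM per gesture would be trained (e.g. with Baum-Welch) on repeated examples of that gesture, which corresponds to the repetitive-training idea described above.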
Given the promising results achieved, the next step in terms of future work is to apply the developed framework in a real application, taking advantage of this new interface for user interaction.
Author: Paulo Silva
Type: MSc thesis
Partner: Faculdade de Engenharia da Universidade do Porto