TouchSense - Complementing the touch information from the smartphone's capacitive screen with data from the accelerometer and gyroscope

The goal of this project is to complement the information given by the smartphone's touch screen with information from the accelerometer and gyroscope. Using sensors to detect touches is not a new idea: there is already research that uses the smartphone accelerometer to infer which keystrokes were made on a touch screen, and to detect a tap even before the touchscreen itself registers it. This project, however, aims to use that information for other objectives: mainly, to gather new information about each touch event in order to extend the smartphone's touch capabilities with data such as tap strength and the holding position while tapping (whether the device is in the user's hands or lying on a hard surface), and to process that information so it is useful to the user or to the developer.
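
The project text only describes the idea, but as a minimal sketch of the sensing side, the Kotlin snippet below uses Android's standard `SensorManager` API to watch the accelerometer and flag sudden spikes as candidate taps, potentially before the touchscreen reports them. The class name `TapSensor` and the spike threshold are illustrative assumptions, not part of the project.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs
import kotlin.math.sqrt

// Illustrative sketch: flag sudden accelerometer spikes as candidate taps,
// possibly before the touchscreen itself registers the touch.
class TapSensor(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    // Assumed threshold (m/s^2 above gravity); real use needs per-device tuning.
    private val spikeThreshold = 1.5f

    // Callback invoked with the spike magnitude and the sensor timestamp (ns).
    var onCandidateTap: ((strength: Float, timestampNs: Long) -> Unit)? = null

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_FASTEST)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type != Sensor.TYPE_ACCELEROMETER) return
        val (x, y, z) = event.values
        // How far the acceleration magnitude deviates from resting gravity.
        val deviation = abs(sqrt(x * x + y * y + z * z) - SensorManager.GRAVITY_EARTH)
        if (deviation > spikeThreshold) {
            onCandidateTap?.invoke(deviation, event.timestamp)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* unused */ }
}
```

The spike magnitude itself is a natural proxy for tap strength: a firmer tap shakes the device more, producing a larger deviation from gravity.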

More concretely, this work extracts additional data from the input of current smartphone touch screens. Capacitive screens are very accurate at detecting where the user taps, but they provide little further information about the performed action. This work focuses on extracting those additional features, such as the strength of the tap and whether the user is tapping while holding the phone in their hands or while it rests on a stand. Although this information may seem trivial at first, it can improve the overall user experience: for example, the interface could adapt depending on how the user is holding the device, or it could detect frustration in the user's taps and show helpful hints when the user appears frustrated.
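
To make the holding-position idea concrete, here is a small, self-contained Kotlin sketch of one plausible approach (the class name and the variance threshold are assumptions for illustration): a phone held in the hand shows continuous low-amplitude rotation on the gyroscope, while a phone lying on a hard surface is nearly still, so the variance of recent gyroscope magnitudes separates the two cases.

```kotlin
import kotlin.math.sqrt

// Illustrative sketch: classify hand-held vs. resting-on-a-surface from the
// variance of recent gyroscope readings (angular velocity in rad/s).
class HoldingPositionClassifier(private val windowSize: Int = 50) {

    private val window = ArrayDeque<Float>()

    // Assumed variance threshold; real use needs per-device calibration.
    private val handHeldThreshold = 1e-4

    /** Feed one gyroscope sample (per-axis angular velocity in rad/s). */
    fun addSample(x: Float, y: Float, z: Float) {
        window.addLast(sqrt(x * x + y * y + z * z))
        if (window.size > windowSize) window.removeFirst()
    }

    /** True if recent motion suggests the phone is in the user's hands. */
    fun isHandHeld(): Boolean {
        if (window.size < windowSize) return false // not enough data yet
        val mean = window.sum() / window.size
        val variance = window.sumOf { v -> (v - mean).toDouble() * (v - mean) } / window.size
        return variance > handHeldThreshold
    }
}
```

A fixed variance threshold is only the simplest choice; the same window of features could be fed to a learned classifier instead, and knowing the posture also matters for tap strength, since the same tap produces a much larger accelerometer spike on a hand-held phone than on one lying on a table.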


For any additional information regarding this project, please contact us using the inquiries form.