There is plenty of documentation and instruction on how to use and interpret data recorded with eye-tracking equipment: eye-movement metrics. These help researchers analyse the data and extract meaning from users’ actions (e.g. long fixations, fixation spatial density, gaze, saccades, scanpaths, transition matrices, etc.).
The proliferation of mobile devices gave rise to different tools for recording users’ interaction with applications. Rather than relying on eye gaze, these tools rely mostly on recording users’ gestures. However, these new tools still lack studies to support the interpretation of the data, and researchers do not yet know what specific sequences or patterns of gestures mean.
The goal of this project is to combine the use of FUSAMI (a web-based platform to perform advanced analytics on real-time mobile applications usage data) with qualitative research to extract meaning from gesture patterns and begin defining new gesture-metrics for usability evaluation.
Expected results: an initial set of gesture metrics for touchscreen interaction, and the identification of distinctive interaction patterns that can be used as gesture metrics to infer users’ behaviour through remote gesture-log visualization analysis.
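To illustrate the kind of analysis involved, the sketch below computes two simple candidate gesture metrics from a log of touch events: per-gesture frequencies and a gesture-to-gesture transition count, analogous to the transition matrices used with eye-tracking scanpaths. The log format and gesture names are hypothetical, not FUSAMI’s actual schema.

```python
from collections import Counter

# Hypothetical gesture log: (timestamp_ms, gesture_type) pairs.
# This format is illustrative only, not the real FUSAMI data schema.
log = [
    (0, "tap"), (300, "swipe"), (650, "swipe"),
    (900, "tap"), (1400, "long_press"), (1700, "swipe"),
]

# Frequency of each gesture type -- a touch analogue of fixation counts.
frequencies = Counter(gesture for _, gesture in log)

# Transition counts: how often gesture A is immediately followed by B,
# mirroring the transition matrices built from eye-tracking scanpaths.
transitions = Counter(
    (a, b) for (_, a), (_, b) in zip(log, log[1:])
)

print(frequencies)
print(transitions.most_common(3))
```

Sequences of such counts, aggregated over many sessions, are the raw material from which distinctive patterns (and ultimately gesture metrics) could be identified.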
For any additional information regarding this project, please contact us using the inquiries form.