UHMAN – Unsupervised Human Motion Annotation


Every day, each of us generates data from wearable and smartphone sensors. This data could be put to use, but it is often unlabelled or incorrectly labelled.

Physical inactivity is one of the main causes of several health conditions, such as heart disease, and is also correlated with overweight and obesity. Conversely, regular physical exercise can improve cardio-respiratory and muscular fitness, functional health, bone and joint health, and cognitive function. For this reason, human physical activity recognition has been increasingly sought as a way to motivate the practice of physical activity. Beyond healthcare, human activity recognition also has applications in sports, elderly monitoring and safety.

The development of a human activity recognition algorithm involves collecting a large amount of labelled data, since a higher volume of data generally improves the algorithm's performance. Most of the time, annotating data labels is too expensive, time-consuming or difficult, and this ground-truth information may not be available at all. The development of an unsupervised annotation method is therefore challenging and particularly interesting within an exploratory machine learning context, as it allows a large amount of unlabelled data to be labelled automatically.

Thus, the aim of this project is to develop a framework for unsupervised human activity recognition using smartphone and wearable sensors.
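To illustrate the kind of unsupervised annotation pipeline the project targets, here is a minimal sketch: segment an accelerometer signal into fixed-size windows, extract simple statistical features, and cluster the windows so that each cluster id can serve as a pseudo-label. The window length, features, synthetic data and two-cluster k-means below are illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

def extract_features(signal, window=50):
    """Segment a 1-D accelerometer signal into fixed-size windows
    and compute simple per-window statistics (mean, std)."""
    n = len(signal) // window
    windows = signal[: n * window].reshape(n, window)
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

def two_means(X, iters=10):
    """Tiny 2-cluster k-means returning a cluster id per row of X.
    Deterministic init (illustrative heuristic): seed the centers at
    the rows with the smallest and largest std feature."""
    centers = X[[int(np.argmin(X[:, 1])), int(np.argmax(X[:, 1]))]].astype(float)
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for j in range(2):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic signal: a low-variance "rest" segment followed by a
# high-variance "active" segment (stand-ins for real sensor data).
rng = np.random.default_rng(1)
rest = rng.normal(0.0, 0.05, 500)
active = rng.normal(0.0, 1.0, 500)
signal = np.concatenate([rest, active])

X = extract_features(signal)          # 20 windows x 2 features
pseudo_labels = two_means(X)          # cluster ids as pseudo-labels
```

In a real pipeline the cluster ids would still need to be mapped to activity names (e.g. by inspecting a few windows per cluster), but the bulk of the annotation effort is removed.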



Novel unsupervised machine learning algorithms to annotate human motion data from smartphone or wearable sensor signals. This work will ease the process of data acquisition and annotation, which is often too expensive, time-consuming or difficult. The results of this project will be applied in several AICOS projects, especially those that apply machine learning techniques to time-series data.


Author: Patrícia Bota

Type: MSc thesis

Partner: Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa

Year: 2018