Human physical activity monitoring is increasingly common in daily life, with applications in health, sports and safety. Thanks to their high computational power, small size and low cost, smartphones and wearable sensors are well suited to monitoring users' daily living activities. However, almost all existing systems require the device to be worn in a specific position, which makes them impractical for long-term activity monitoring: a change in position can noticeably degrade accuracy.
This thesis describes a novel algorithm to detect human activity independently of sensor placement. To limit battery consumption, only two sensors were used: the accelerometer and the barometer, sampled at 30 Hz and 5 Hz, respectively. The recorded signals were then divided into 5-second windows.
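The windowing step described above can be sketched as follows. This is a minimal illustration, not the thesis code: the function name and the non-overlapping-window assumption are mine; only the sampling rates (30 Hz and 5 Hz) and the 5-second window length come from the text.

```python
import numpy as np

# Sampling rates and window length taken from the text.
ACC_FS = 30   # accelerometer, Hz
BARO_FS = 5   # barometer, Hz
WINDOW_S = 5  # seconds per window

def segment(signal: np.ndarray, fs: int, window_s: int = WINDOW_S) -> np.ndarray:
    """Split a 1-D signal into consecutive non-overlapping windows.

    Trailing samples that do not fill a complete window are discarded.
    (Overlap, if any, is an implementation detail not given in the abstract.)
    """
    samples_per_window = fs * window_s
    n_windows = len(signal) // samples_per_window
    return signal[: n_windows * samples_per_window].reshape(n_windows, samples_per_window)

# Example: one minute of synthetic data yields 12 windows per sensor.
acc = np.random.randn(60 * ACC_FS)     # 1800 accelerometer samples
baro = np.random.randn(60 * BARO_FS)   # 300 barometer samples
acc_windows = segment(acc, ACC_FS)     # shape (12, 150)
baro_windows = segment(baro, BARO_FS)  # shape (12, 25)
```

Because the two sensors run at different rates, the same 5-second window contains 150 accelerometer samples but only 25 barometer samples.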
The dataset comprises 25 subjects and more than 7 hours of recordings. Daily living activities such as walking, running, sitting, standing, and walking upstairs and downstairs were performed, with the smartphone worn in 12 different positions. From each window, a set of statistical, temporal and spectral features was extracted and selected. For classification, a decision tree was trained and evaluated using leave-one-user-out cross-validation.
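The feature extraction and evaluation pipeline can be sketched with scikit-learn, where leave-one-user-out corresponds to `LeaveOneGroupOut` with the subject identifier as the group. The specific features below and the synthetic data are illustrative assumptions; the abstract only states that statistical, temporal and spectral features feed a decision tree.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

def window_features(window: np.ndarray) -> np.ndarray:
    """Illustrative statistical, temporal and spectral features for one window.

    The thesis' exact feature set is not listed in the abstract;
    these are common choices for accelerometer windows.
    """
    spectrum = np.abs(np.fft.rfft(window))
    return np.array([
        window.mean(),                   # statistical: mean
        window.std(),                    # statistical: standard deviation
        np.abs(np.diff(window)).mean(),  # temporal: mean absolute difference
        float(spectrum.argmax()),        # spectral: dominant frequency bin
        spectrum.sum(),                  # spectral: total energy
    ])

# Toy stand-in for the real dataset: 100 windows of 150 samples each,
# from 5 hypothetical subjects performing 3 hypothetical activities.
rng = np.random.default_rng(0)
windows = rng.standard_normal((100, 150))
X = np.vstack([window_features(w) for w in windows])
y = rng.integers(0, 3, size=100)          # activity label per window
subjects = rng.integers(0, 5, size=100)   # subject id per window

# Leave-one-user-out: each fold holds out every window of one subject,
# so the classifier is always tested on a person it has never seen.
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y,
                         groups=subjects, cv=LeaveOneGroupOut())
print(f"accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```

Grouping the folds by subject rather than shuffling windows randomly is what makes the reported accuracy meaningful for unseen users: windows from the same person never appear in both the training and test sets.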
The developed framework achieved an accuracy of 94.53 ± 6.82%, regardless of the subject and the device's position.
The major outcome is a system that lets a smartphone monitor users' activities in a simple way, without requiring a specific position. The contributions of this thesis are applicable in many scenarios, such as monitoring the elderly, supporting rehabilitation in physiotherapy, and serving ordinary users who simply want to check their daily level of physical activity. In all these cases, position independence is a key concern, not only for comfort and usability, but also to avoid misplacement and, consequently, false positive or negative results.