
Development of a non-invasive ultrasonic sensor network for the measurement of human activities in indoor environment using multi-stage classification process

Ciuffreda, Ilaria; Cosoli, Gloria; Arnesano, Marco
2025-01-01

Abstract

This study is situated in the field of indoor comfort measurement, with particular focus on identifying the presence of occupants and discriminating their activities in the environment. A non-invasive (and non-wearable) Human Activity Recognition (HAR) system, based on ultrasonic (US) sensors for monitoring activities in office environments, is presented. The methodology was developed in a controlled environment and involved 10 participants performing predefined office activities. The raw US sensor data underwent a pre-processing phase, which included calibration to filter out environmental noise and segmentation into 5-s windows. This phase transforms the raw distance data into meaningful displacement measurements that serve as input for the classification models. The root mean square (RMS) temporal marker is extracted and used as the input feature of a two-stage classification approach that differentiates between low-intensity and high-intensity activities. Specifically, to increase the accuracy of activity detection, the two-stage approach combines machine learning and deep learning techniques. In the first stage, a support vector machine classifier distinguishes between static and dynamic office activities with 93.1 % accuracy. In the second stage, a convolutional neural network further classifies specific dynamic office activities, such as writing, typing on a PC, talking on the phone, and standing, with a mean accuracy of 99.3 %. The experimental results confirm the high performance of the proposed HAR system, demonstrating the effectiveness of US sensors for reliable activity recognition. This scalable solution has the potential to enhance user comfort by integrating US-based HAR systems into personalized comfort models and optimizing resource usage in indoor environments.
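A minimal sketch of the pre-processing and first-stage classification described above. The sampling rate, the synthetic displacement data, and the use of scikit-learn's `SVC` with default settings are illustrative assumptions; the abstract does not specify the sensor's sampling rate or the authors' SVM configuration.

```python
import numpy as np
from sklearn.svm import SVC

def segment(displacement, fs, win_s=5.0):
    """Split a 1-D displacement signal into non-overlapping 5-s windows.

    fs is the sampling rate in Hz (an assumption, not stated in the abstract).
    """
    n = int(fs * win_s)
    n_win = len(displacement) // n
    return displacement[:n_win * n].reshape(n_win, n)

def rms(windows):
    """Root-mean-square temporal marker: one scalar feature per window."""
    return np.sqrt(np.mean(windows ** 2, axis=1))

# Illustrative stage 1: separate static (low RMS) from dynamic (high RMS)
# windows with an SVM, trained on synthetic displacement data.
rng = np.random.default_rng(0)
fs = 10  # Hz, assumed for illustration
static = 0.05 * rng.standard_normal(20 * int(fs * 5))   # small displacements
dynamic = 1.0 * rng.standard_normal(20 * int(fs * 5))   # large displacements

X = rms(segment(np.concatenate([static, dynamic]), fs)).reshape(-1, 1)
y = np.array([0] * 20 + [1] * 20)  # 0 = static, 1 = dynamic

clf = SVC(kernel="rbf").fit(X, y)
```

In the second stage, windows labelled dynamic by the SVM would be routed to a convolutional neural network for fine-grained activity classification (writing, typing, phone call, standing); that model is omitted from this sketch.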

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11389/74015
