Extracting behavioral motifs for characterizing human daily activities in smart environments

Research output: Conference contribution (chapter in book/report/conference proceeding)

Abstract

In view of the aging population and the growing need for assisted living, smart houses equipped with basic sensors have been investigated to make 24-hour monitoring and tracking of residents’ indoor activities of daily living possible. Based on the sensor data, healthcare professionals can carry out in-depth examination of residents’ activities of daily living to monitor their health status. This paper aims to develop a computational algorithm that infers, from the sensor data, behavioral motifs for characterizing the residents living in such smart environments. The motifs are event patterns that are over-represented in one resident’s behavior when compared with others. A three-step approach is proposed for the motif extraction, and a mixture model is adopted. We evaluated the proposed approach using the WSU CASAS dataset (which contains sensor data of 20 persons living in the smart house) and provided detailed interpretation to demonstrate how the extracted behavioral motifs can be used to characterize the residents’ behaviors. We anticipate that such a behavioral motif extraction tool can help healthcare professionals analyze human daily activities more effectively.
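The idea of finding over-represented event patterns can be sketched as follows. This is an illustrative example only, not the paper's method: it scores fixed-length sensor-event n-grams by comparing their relative frequency in one resident's event stream against the pooled stream of all other residents, whereas the paper uses a three-step approach with a mixture model. The function names and the smoothing scheme are assumptions for illustration.

```python
from collections import Counter

def ngrams(events, n):
    """All length-n windows of a sensor event sequence."""
    return [tuple(events[i:i + n]) for i in range(len(events) - n + 1)]

def overrepresented_motifs(resident_events, others_events, n=3, min_ratio=2.0):
    """Return event patterns whose relative frequency in one resident's
    stream exceeds the pooled frequency over all other residents by at
    least min_ratio. Illustrative frequency-ratio scoring only; the
    paper's mixture-model formulation is not reproduced here."""
    res = Counter(ngrams(resident_events, n))
    oth = Counter()
    for seq in others_events:
        oth.update(ngrams(seq, n))
    res_total = sum(res.values()) or 1
    oth_total = sum(oth.values()) or 1
    motifs = {}
    for pattern, count in res.items():
        p_res = count / res_total
        # add-one smoothing so unseen patterns do not divide by zero
        p_oth = (oth.get(pattern, 0) + 1) / (oth_total + len(res))
        if p_res / p_oth >= min_ratio:
            motifs[pattern] = p_res / p_oth
    return motifs
```

For instance, if one resident's stream alternates between two sensors that other residents rarely trigger together, the corresponding bigram would be reported as a motif with a high over-representation score.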
Original language: English
Title of host publication: ACM SIGKDD Workshop on Health Informatics (HI-KDD 2012)
Publisher: Association for Computing Machinery
Publication status: Published - Aug 2012
Event: ACM SIGKDD Workshop on Health Informatics (HI-KDD 2012) - Beijing, China
Duration: 12 Aug 2012 → …


User-Defined Keywords

  • Activity of Daily Living
  • smart environments
  • behavioral motifs
