Evidential network-based multimodal fusion for fall detection

Paulo Armando Cavalcante Aguilar, Jerome Boudy, Dan Istrate, Hamid Medjahed, Bernadette Dorizzi, João Cesar Moura Mota, Jean Louis Baldinger, Toufik Guettari, Imad Belfeki

Research output: Contribution to journal › Article › peer-review

Abstract

Multi-sensor fusion can provide more accurate and reliable information than the information taken from each sensor separately. Moreover, the data from the multiple heterogeneous sensors present in medical surveillance systems carry different degrees of uncertainty. Among multi-sensor data fusion techniques, Bayesian methods and evidence theories such as Dempster-Shafer Theory (DST) are commonly used to handle the degree of uncertainty in the fusion process. Based on a graphical representation of the DST called Evidential Networks, we propose a structure for heterogeneous multi-sensor fusion for fall detection. The proposed Evidential Network (EN) can handle the uncertainty present in mobile and fixed sensor-based remote monitoring systems (fall detection) by fusing them, thereby increasing fall detection sensitivity compared with either system operating alone.
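The core operation underlying DST-based fusion is Dempster's rule of combination, which merges the basic belief assignments of two sources after normalizing out their conflict. The sketch below illustrates the rule on a two-hypothesis frame of discernment {fall, no_fall}; the sensor names and mass values are hypothetical placeholders, not figures from the article.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Fuse two mass functions over the same frame via Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # Mass flows to the intersection of the two focal elements.
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are irreconcilable")
    # Renormalize by the non-conflicting mass.
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

FALL = frozenset({"fall"})
NO_FALL = frozenset({"no_fall"})
THETA = FALL | NO_FALL  # frame of discernment (total ignorance)

# Hypothetical basic belief assignments from two heterogeneous sensors.
m_wearable = {FALL: 0.6, NO_FALL: 0.1, THETA: 0.3}
m_ambient = {FALL: 0.7, NO_FALL: 0.1, THETA: 0.2}

fused = dempster_combine(m_wearable, m_ambient)
```

With these illustrative masses, the fused belief in `fall` exceeds that of either sensor alone, which is the mechanism by which fusion can raise detection sensitivity relative to a single system.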

Original language: English
Pages (from-to): 46-60
Number of pages: 15
Journal: International Journal of E-Health and Medical Communications
Volume: 4
Issue number: 1
DOIs
Publication status: Published - 1 Jan 2013

Keywords

  • Dempster-Shafer theory
  • Evidential networks
  • Fall detection
  • Multi-sensor fusion
  • Remote
