Generating shared latent variables for robots to imitate human movements and understand their physical limitations

Maxime Devanne, Sao Mai Nguyen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Assistive robotics, and in particular robot coaches, may be very helpful for rehabilitation healthcare. In this context, we propose a method based on the Gaussian Process Latent Variable Model (GP-LVM) to transfer knowledge between a physiotherapist, a robot coach and a patient. Our model maps visual human body features to robot data in order to facilitate robot learning and imitation. In addition, we propose to extend the model to adapt the robot’s understanding to patients’ physical limitations during the assessment of rehabilitation exercises. Experimental evaluation demonstrates promising results for both robot imitation and model adaptation according to patients’ limitations.
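The core idea of the abstract — a latent space shared between a human view and a robot view, used to map observed human movements to robot commands — can be sketched with a simplified linear analogue. This is a hypothetical illustration, not the paper's code: the paper uses a shared GP-LVM, while the sketch below learns a shared latent space by PCA over paired human/robot observations (synthetic data, arbitrary dimensions) so that it stays dependency-free beyond NumPy.

```python
import numpy as np

# Hypothetical sketch: a *linear* shared latent space (PCA) standing in for
# the shared GP-LVM of the paper. All dimensions and data are made up.

rng = np.random.default_rng(0)

# Synthetic paired training data: a 2-D latent "movement" drives both views.
n, d_latent = 200, 2
Z = rng.normal(size=(n, d_latent))            # shared latent positions
W_human = rng.normal(size=(d_latent, 10))     # -> human body features (e.g. joints seen by vision)
W_robot = rng.normal(size=(d_latent, 6))      # -> robot joint angles
Y_human = Z @ W_human
Y_robot = Z @ W_robot

# Learn a shared latent space from the concatenated views (PCA via SVD).
Y = np.hstack([Y_human, Y_robot])
Y_mean = Y.mean(axis=0)
_, _, Vt = np.linalg.svd(Y - Y_mean, full_matrices=False)
V = Vt[:d_latent]                             # shared loadings
V_h, V_r = V[:, :10], V[:, 10:]               # split per view

def human_to_robot(y_h):
    """Map one human observation to robot joints through the shared latent space."""
    # Infer the latent position from the human view alone (least squares),
    # then decode that latent point with the robot-view loadings.
    z, *_ = np.linalg.lstsq(V_h.T, y_h - Y_mean[:10], rcond=None)
    return V_r.T @ z + Y_mean[10:]

# Imitation check on a held-out movement: predicted robot joints should
# match the joints generated from the same latent point.
z_new = rng.normal(size=d_latent)
pred = human_to_robot(z_new @ W_human)
err = np.linalg.norm(pred - z_new @ W_robot)
```

In the linear, noise-free setting the mapping recovers the robot joints exactly (up to floating point); the GP-LVM replaces the linear loadings with nonlinear GP mappings and adds uncertainty estimates, which is what allows the paper's second contribution, adapting the model to a patient's physical limitations.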

Original language: English
Title of host publication: Computer Vision – ECCV 2018 Workshops, Proceedings
Editors: Stefan Roth, Laura Leal-Taixé
Publisher: Springer Verlag
Pages: 190-197
Number of pages: 8
ISBN (Print): 9783030110116
DOIs
Publication status: Published - 1 Jan 2019
Event: 15th European Conference on Computer Vision, ECCV 2018 - Munich, Germany
Duration: 8 Sept 2018 – 14 Sept 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11130 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 15th European Conference on Computer Vision, ECCV 2018
Country/Territory: Germany
City: Munich
Period: 8/09/18 – 14/09/18

Keywords

  • Motion analysis
  • Physical rehabilitation
  • Robot imitation
  • Shared Gaussian process latent variable model
  • Knowledge transfer

