Dynamic real-time deformations using space & time adaptive sampling

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper presents a robust, adaptive method for animating dynamic visco-elastic deformable objects that provides a guaranteed frame rate. Our approach uses a novel automatic space and time adaptive level-of-detail technique, in combination with a large-displacement (Green) strain tensor formulation. The body is partitioned into a non-nested multiresolution hierarchy of tetrahedral meshes. The local resolution is determined by a quality condition that indicates where and when the resolution is too coarse. As the object moves and deforms, the sampling is refined to concentrate the computational load in the regions that deform the most. Our model consists of a continuous differential equation that is solved using a local explicit finite element method. We demonstrate that our adaptive Green strain tensor formulation suppresses unwanted artifacts in the dynamic behavior, compared to adaptive mass-spring and other adaptive approaches. In particular, damped elastic vibration modes are shown to be nearly unchanged for several levels of refinement. Results are presented in the context of a virtual reality system. The user interacts in real time with the dynamic object through the control of a rigid tool, attached to a haptic device driven with forces derived from the method.
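As a minimal illustration of the large-displacement (Green) strain tensor the abstract refers to, the sketch below computes E = ½(FᵀF − I) from a deformation gradient F. This is a generic textbook formulation, not code from the paper; the helper names are illustrative, and the paper's actual method additionally handles the adaptive tetrahedral hierarchy and time integration.

```python
# Hedged sketch: Green (large-displacement) strain for one 3x3
# deformation gradient F, using pure-Python matrix helpers.
# E = 0.5 * (F^T F - I); unlike the linear (Cauchy) strain, it
# vanishes under rigid rotations, which avoids the artifacts the
# paper contrasts against simpler adaptive models.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(a):
    return [[a[j][i] for j in range(3)] for i in range(3)]

def green_strain(f):
    """Return E = 0.5 * (F^T F - I) for a 3x3 deformation gradient."""
    c = matmul(transpose(f), f)  # right Cauchy-Green tensor C = F^T F
    return [[0.5 * (c[r][s] - (1.0 if r == s else 0.0))
             for s in range(3)] for r in range(3)]

# A pure 90-degree rotation about z produces zero Green strain,
# while a 10% stretch along x yields E_xx = 0.5 * (1.1^2 - 1) = 0.105.
rotation = [[0.0, -1.0, 0.0],
            [1.0,  0.0, 0.0],
            [0.0,  0.0, 1.0]]
stretch = [[1.1, 0.0, 0.0],
           [0.0, 1.0, 0.0],
           [0.0, 0.0, 1.0]]
```

For a rigid motion (orthogonal F), FᵀF = I exactly, so E is identically zero; this rotation invariance is what lets the formulation keep the damped elastic vibration modes nearly unchanged across refinement levels.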

Original language: English
Pages: 31-36
Number of pages: 6
Publication status: Published - 1 Jan 2001
Event: Computer Graphics Annual Conference (SIGGRAPH 2001) - Los Angeles, CA, United States
Duration: 12 Aug 2001 - 17 Aug 2001
