A view-dependent adaptivity metric for real time mesh tessellation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Real-time tessellation methods offer the ability to upsample 3D surface meshes on the fly during rendering. This upsampling relies on three major steps. First, it requires a tessellation kernel, which can be implemented on the GPU or may already be available as a hardware unit. Second, the surface model defines the positions of the newly inserted vertices; we focus on recent visually smooth models. Third, adaptive sampling tailors the spatially varying distribution of newly inserted vertices on the input surface. We study this last component and introduce a new view-dependent adaptivity metric which builds upon both intrinsic and extrinsic criteria of the input mesh. As a result, we obtain a better vertex distribution around important features in the tessellated mesh.
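The pipeline described above can be illustrated with a small sketch of its third step. The function below computes a per-edge tessellation level by blending an intrinsic term (normal deviation between the edge endpoints, a crude curvature proxy) with an extrinsic, view-dependent term (apparent edge size from the camera). The weights, scaling, and function name are hypothetical illustrations of the general idea, not the metric proposed in the paper.

```python
import math

def tess_factor(p0, p1, n0, n1, cam_pos, fov_scale=1.0,
                w_intrinsic=0.5, w_extrinsic=0.5, max_level=6):
    """Illustrative per-edge tessellation level (sketch, not the
    paper's metric). p0/p1 are edge endpoints, n0/n1 their unit
    normals, cam_pos the camera position."""
    # Intrinsic criterion: 0 for coplanar normals, 1 for opposing
    # normals -- flat regions need fewer inserted vertices.
    dot = sum(a * b for a, b in zip(n0, n1))
    intrinsic = (1.0 - max(-1.0, min(1.0, dot))) * 0.5

    # Extrinsic criterion: edge length divided by distance to the
    # camera, a rough proxy for projected screen-space size.
    mid = [(a + b) * 0.5 for a, b in zip(p0, p1)]
    edge_len = math.dist(p0, p1)
    dist = max(1e-6, math.dist(mid, cam_pos))
    extrinsic = min(1.0, fov_scale * edge_len / dist)

    # Blend both criteria and map to a discrete tessellation level.
    metric = w_intrinsic * intrinsic + w_extrinsic * extrinsic
    return max(1, round(metric * max_level))
```

A flat, distant edge yields the minimum level, while a highly curved edge close to the viewer receives a high level, concentrating new vertices around visually important features.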

Original language: English
Title of host publication: 2010 IEEE International Conference on Image Processing, ICIP 2010 - Proceedings
Pages: 3969-3972
Number of pages: 4
DOIs
Publication status: Published - 1 Dec 2010
Externally published: Yes
Event: 2010 17th IEEE International Conference on Image Processing, ICIP 2010 - Hong Kong, Hong Kong
Duration: 26 Sept 2010 - 29 Sept 2010

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 2010 17th IEEE International Conference on Image Processing, ICIP 2010
Country/Territory: Hong Kong
City: Hong Kong
Period: 26/09/10 - 29/09/10

Keywords

  • Adaptive geometry
  • Computer graphics
  • Meshes
  • Real time rendering
  • Tessellation

