
Parameterization Robustness of 3D Auto-Encoders

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The efficient generation of 3-dimensional geometric objects is a thriving research topic, driven for example by the development of geometric deep learning, which extends classical machine learning concepts to non-Euclidean data such as graphs or meshes. In this short paper, we study the effect of a reparameterization on two popular mesh and point cloud neural networks used in an auto-encoder mode: PointNet [QSMG16] and SpiralNet [BBP19]. We also test a modified version of PointNet that takes orientation into account (through the coordinates of the normals), as a first step towards a geometric deep learning model built on a metric that is more flexible with respect to the parameterization. Experimental results on standardized face datasets show that, in this specific context and with the proposed reparameterization, SpiralNet is more robust to reparameterization than PointNet.
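
To make the setup concrete, below is a minimal sketch of a PointNet-style auto-encoder whose input includes per-point normals (x, y, z, nx, ny, nz), in the spirit of the orientation-aware variant mentioned in the abstract. The layer widths, latent size, and class names are illustrative assumptions, not the authors' architecture, and the random index permutation at the end is only a toy stand-in for the surface reparameterization studied in the paper (PointNet is permutation-invariant by construction thanks to its symmetric max-pooling).

```python
# Hypothetical sketch, not the paper's exact model.
import torch
import torch.nn as nn


class PointNetAE(nn.Module):
    def __init__(self, num_points=1024, in_dim=6, latent_dim=128):
        super().__init__()
        self.num_points = num_points
        # Shared per-point MLP (1x1 convolutions) over (xyz + normal)
        # features; a symmetric max-pool then makes the global code
        # invariant to the order in which points are listed.
        self.encoder = nn.Sequential(
            nn.Conv1d(in_dim, 64, 1), nn.ReLU(),
            nn.Conv1d(64, 128, 1), nn.ReLU(),
            nn.Conv1d(128, latent_dim, 1),
        )
        # Decoder maps the global latent code back to xyz coordinates.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, num_points * 3),
        )

    def forward(self, x):
        # x: (batch, in_dim, num_points), normals stacked on coordinates.
        feat = self.encoder(x)            # (batch, latent_dim, num_points)
        code = feat.max(dim=2).values     # symmetric pooling -> (batch, latent_dim)
        out = self.decoder(code)          # (batch, num_points * 3)
        return out.view(-1, 3, self.num_points)


if __name__ == "__main__":
    # Toy check: a random re-indexing of the points leaves the
    # reconstruction unchanged, unlike a genuine reparameterization
    # (resampling of the surface), which is what the paper evaluates.
    model = PointNetAE()
    pts = torch.randn(2, 6, 1024)
    perm = torch.randperm(1024)
    rec_a = model(pts)
    rec_b = model(pts[:, :, perm])
    print(torch.allclose(rec_a, rec_b, atol=1e-5))  # True
```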

Original language: English
Title of host publication: EG 3DOR 2022 - Eurographics Workshop on 3D Object Retrieval Short Papers
Editors: Dieter W. Fellner, Werner Hansmann, Werner Purgathofer, Francois Sillion
Publisher: Eurographics Association
Pages: 17-23
Number of pages: 7
ISBN (Electronic): 9783038681748
DOIs
Publication status: Published - 1 Jan 2022
Externally published: Yes
Event: 2022 Eurographics Workshop on 3D Object Retrieval, EG 3DOR 2022 - Virtual, Online, Italy
Duration: 1 Sept 2022 - 2 Sept 2022

Publication series

Name: Eurographics Workshop on 3D Object Retrieval, EG 3DOR
Volume: 2022-September
ISSN (Print): 1997-0463
ISSN (Electronic): 1997-0471

Conference

Conference: 2022 Eurographics Workshop on 3D Object Retrieval, EG 3DOR 2022
Country/Territory: Italy
City: Virtual, Online
Period: 1/09/22 - 2/09/22
