Generative Zero-Shot Learning for Semantic Segmentation of 3D Point Clouds

  • Bjorn Michele
  • Alexandre Boulch
  • Gilles Puy
  • Maxime Bucher
  • Renaud Marlet

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

While there have been a number of studies on Zero-Shot Learning (ZSL) for 2D images, its application to 3D data is still recent and scarce, with just a few methods limited to classification. We present the first generative approach for both ZSL and Generalized ZSL (GZSL) on 3D data, which can handle both classification and, for the first time, semantic segmentation. We show that it reaches or outperforms the state of the art on ModelNet40 classification for both inductive ZSL and inductive GZSL. For semantic segmentation, we created three benchmarks for evaluating this new ZSL task, using S3DIS, ScanNet and SemanticKITTI. Our experiments show that our method outperforms strong baselines, which we additionally propose for this task.
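
To make the term concrete, below is a minimal sketch of the generative zero-shot paradigm the abstract refers to: a generator conditioned on class word embeddings synthesizes features for unseen classes, and a classifier is then trained on real seen-class features together with these synthetic ones. All module names, dimensions, class counts and the training step are illustrative assumptions (in PyTorch), not the architecture or losses used in the paper.

```python
import torch
import torch.nn as nn

class FeatureGenerator(nn.Module):
    """Maps (class word embedding, noise) -> a synthetic 3D feature vector."""
    def __init__(self, emb_dim=300, noise_dim=64, feat_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(emb_dim + noise_dim, 512),
            nn.LeakyReLU(0.2),
            nn.Linear(512, feat_dim),
        )

    def forward(self, class_emb, noise):
        return self.net(torch.cat([class_emb, noise], dim=-1))

# Hypothetical setup: 10 seen classes, 4 unseen classes, 300-d word embeddings.
emb_dim, noise_dim, feat_dim = 300, 64, 256
word_emb = torch.randn(14, emb_dim)          # placeholder class embeddings
generator = FeatureGenerator(emb_dim, noise_dim, feat_dim)

# Synthesize features for the unseen classes (ids 10..13), 100 samples each.
unseen_ids = torch.arange(10, 14).repeat_interleave(100)
noise = torch.randn(len(unseen_ids), noise_dim)
fake_feats = generator(word_emb[unseen_ids], noise)

# A classifier over all 14 classes can then be trained on the union of real
# seen-class features (from a frozen 3D backbone) and these synthetic
# unseen-class features, enabling (generalized) zero-shot prediction.
classifier = nn.Linear(feat_dim, 14)
logits = classifier(fake_feats)
loss = nn.functional.cross_entropy(logits, unseen_ids)
loss.backward()
```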

Original language: English
Title of host publication: Proceedings - 2021 International Conference on 3D Vision, 3DV 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 992-1002
Number of pages: 11
ISBN (Electronic): 9781665426886
Publication status: Published - 1 Jan 2021
Event: 9th International Conference on 3D Vision, 3DV 2021 - Virtual, Online, United Kingdom
Duration: 1 Dec 2021 - 3 Dec 2021

Publication series

Name: Proceedings - 2021 International Conference on 3D Vision, 3DV 2021

Conference

Conference: 9th International Conference on 3D Vision, 3DV 2021
Country/Territory: United Kingdom
City: Virtual, Online
Period: 1/12/21 - 3/12/21

Keywords

  • 3D data
  • Frugal learning
  • GZSL
  • Generalized Zero Shot learning
  • Point cloud
  • Semantic segmentation
  • Transfer learning
  • ZSS
  • Zero Shot Semantic Segmentation
  • Zero-Shot learning
