A Deep Learning-Based Approach for Camera Motion Classification

Kaouther Ouenniche, Ruxandra Tapu, Titus Zaharia

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The automatic estimation of the various types of camera motion (e.g., traveling, panning, rolling, zooming) present in videos represents an important challenge for automatic video indexing. Previous research works are mainly based on optical flow estimation and analysis. In this paper, we propose a different, deep learning-based approach that makes it possible to classify videos according to the type of camera motion. The proposed method is inspired by action recognition approaches and exploits 3D convolutional neural networks with residual blocks. The performance is objectively evaluated on challenging videos involving blurry frames, fast/slow motion, and poorly textured scenes. The accuracy rates obtained (with an average score of 94%) demonstrate the robustness of the proposed model.
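The core idea of the abstract — 3D convolutions over video clips combined with residual (identity-shortcut) blocks, followed by a classification head — can be illustrated with a toy numpy sketch. This is not the authors' actual network: the layer sizes, weight shapes, and the `residual_block`/`classify` helpers are illustrative assumptions, and a real implementation would use a deep-learning framework.

```python
import numpy as np

def conv3d(x, w, pad=1):
    """Naive 'same'-padded 3D convolution.
    x: (C_in, T, H, W) video clip, w: (C_out, C_in, k, k, k) kernels."""
    c_out, c_in, k, _, _ = w.shape
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad), (pad, pad)))
    _, T, H, W = x.shape
    y = np.zeros((c_out, T, H, W))
    for o in range(c_out):
        for t in range(T):
            for i in range(H):
                for j in range(W):
                    # correlate the kernel with a k*k*k spatio-temporal window
                    y[o, t, i, j] = np.sum(xp[:, t:t + k, i:i + k, j:j + k] * w[o])
    return y

def residual_block(x, w1, w2):
    """y = ReLU(x + Conv(ReLU(Conv(x)))): the identity shortcut
    that characterizes residual learning (ResNet-style)."""
    h = np.maximum(conv3d(x, w1), 0.0)
    h = conv3d(h, w2)
    return np.maximum(x + h, 0.0)

def classify(features, w_fc):
    """Global-average-pool over time and space, then a linear layer
    mapping channel features to motion-class logits."""
    pooled = features.mean(axis=(1, 2, 3))   # (C,)
    return np.argmax(pooled @ w_fc)          # index of predicted class
```

A usage sketch: feed a small random clip through one residual block, then classify it into one of four hypothetical motion classes (traveling, panning, rolling, zooming).

```python
rng = np.random.default_rng(0)
clip = rng.standard_normal((4, 4, 8, 8))            # (C, T, H, W)
w1 = rng.standard_normal((4, 4, 3, 3, 3)) * 0.1
w2 = rng.standard_normal((4, 4, 3, 3, 3)) * 0.1
w_fc = rng.standard_normal((4, 4))                  # 4 channels -> 4 classes
feats = residual_block(clip, w1, w2)
label = classify(feats, w_fc)
```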

Original language: English
Title of host publication: Proceedings of the 2021 9th European Workshop on Visual Information Processing, EUVIP 2021
Editors: A. Beghdadi, F. Alaya Cheikh, J.M.R.S. Tavares, A. Mokraoui, G. Valenzise, L. Oudre, M.A. Qureshi
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665432306
DOIs
Publication status: Published - 23 Jun 2021
Event: 9th European Workshop on Visual Information Processing, EUVIP 2021 - Paris, France
Duration: 23 Jun 2021 – 25 Jun 2021

Publication series

Name: Proceedings - European Workshop on Visual Information Processing, EUVIP
Volume: 2021-June
ISSN (Print): 2471-8963

Conference

Conference: 9th European Workshop on Visual Information Processing, EUVIP 2021
Country/Territory: France
City: Paris
Period: 23/06/21 – 25/06/21

Keywords

  • 3D CNN
  • Camera motion classification
  • ResNet
  • Deep learning
