Real-time visual pose estimation: from BOP objects to custom drone — A journey

Thomas Rey, Julien Moras, Alexandre Eudes, Antoine Manzanera

Research output: Contribution to journal › Article › peer-review

Abstract

Pose estimation plays a crucial role in robotics for grasping tasks and in augmented-reality applications, yet its use for real-world far-range estimation has not been thoroughly studied. This study evaluates pose estimators on a custom drone at distances from 0.5 m to 10 m, which is beyond the scope of existing datasets, which only contain objects closer than 2 m. We created synthetic and real databases specific to our drone and compared various RGB pose estimators, evaluating their performance across different distances. PViT-6D, one of the state-of-the-art (SoTA) methods on the classic [0, 2] m interval, also outperforms the other estimators at greater distances and proves robust to detection inaccuracy. The results demonstrate the potential of PViT-6D for a real-time application embedded on the drone platform. This work evaluates the potential of pose estimators for mutual perception and communication within a drone swarm.

Original language: English
Article number: 103339
Journal: Mechatronics
Volume: 109
Publication status: Published - 1 Aug 2025

Keywords

  • Computer Vision
  • Drone swarms
  • Far range
  • Monocular
  • Pose estimation
