Fusion of Interest Point/Image based descriptors for efficient person re-identification

Mohamed Ibn Khedher, Houda Jmila, Mounim A. El Yacoubi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The paper proposes a novel video-based person re-identification system that describes a person using both Interest Point (IP) and Image-based features. The Image-based descriptor extracts a global image representation that captures the silhouette as well as possible extra objects (e.g., an animal or a stroller), while the IP-based descriptor extracts salient points, each associated with a local region of one of these objects. Two re-identification systems are proposed: an IP-based system using SURF interest points matched via sparse representation, and an Image-based system using a Convolutional Neural Network. To harness both representations, we propose a fusion strategy based on the product rule over scores, the scores being vote vectors associated with each descriptor for each person. Our proposal is evaluated on the large public dataset PRID-2011, and the results show its effectiveness compared to the state of the art.
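
To illustrate the product-rule fusion of score vectors described in the abstract, the sketch below combines two per-identity vote vectors by normalizing each and taking their element-wise product before ranking. The function name, the normalization step, and the toy values are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np

def product_rule_fusion(ip_scores, cnn_scores, eps=1e-12):
    """Fuse two per-identity score (vote) vectors with the product rule.

    Each input is a 1-D array holding one score per gallery identity;
    higher means a stronger match. Scores are normalized to sum to 1
    so they act like vote distributions before being multiplied.
    (Hypothetical sketch; not the paper's code.)
    """
    ip = np.asarray(ip_scores, dtype=float)
    cnn = np.asarray(cnn_scores, dtype=float)
    ip = ip / (ip.sum() + eps)
    cnn = cnn / (cnn.sum() + eps)
    fused = ip * cnn                      # element-wise product rule
    return fused / (fused.sum() + eps)    # re-normalize for ranking

# Toy example with three gallery identities.
ip_votes  = np.array([12, 3, 5])        # e.g., SURF matches voting per identity
cnn_votes = np.array([0.6, 0.3, 0.1])   # e.g., CNN similarity scores
fused = product_rule_fusion(ip_votes, cnn_votes)
predicted_identity = int(np.argmax(fused))
```

The identity with the highest fused score is returned as the match; the product rule rewards identities that both descriptors agree on.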

Original language: English
Title of host publication: 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781509060146
DOIs
Publication status: Published - 10 Oct 2018
Externally published: Yes
Event: 2018 International Joint Conference on Neural Networks, IJCNN 2018 - Rio de Janeiro, Brazil
Duration: 8 Jul 2018 - 13 Jul 2018

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2018-July

Conference

Conference: 2018 International Joint Conference on Neural Networks, IJCNN 2018
Country/Territory: Brazil
City: Rio de Janeiro
Period: 8/07/18 - 13/07/18
