Feature selection with Rényi min-entropy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We consider the problem of feature selection, and we propose a new information-theoretic algorithm for ordering the features according to their relevance for classification. The novelty of our proposal consists in adopting Rényi min-entropy instead of the commonly used Shannon entropy. In particular, we adopt a notion of conditional min-entropy that has been recently proposed in the field of security and privacy, and which is strictly related to the Bayes error. We evaluate our method on two classifiers and three datasets, and we show that it compares favorably with the corresponding one based on Shannon entropy.
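The conditional min-entropy mentioned in the abstract (the notion from quantitative information flow, due to Smith) is H∞(C | F) = -log₂ Σ_f max_c P(c, f), and 1 − Σ_f max_c P(c, f) is exactly the Bayes error of the MAP classifier, which is what ties the score to classification performance. The sketch below is a simplified, univariate illustration of ranking features by this quantity; it is not the authors' exact algorithm (the paper evaluates a full selection procedure), and the function names are ours.

```python
import numpy as np

def cond_min_entropy(labels, feature):
    """Conditional Rényi min-entropy H_inf(C | F) = -log2( sum_f max_c P(c, f) ).

    Lower values mean the feature is more informative about the class:
    sum_f max_c P(c, f) is the accuracy of the MAP (Bayes-optimal)
    classifier that predicts the class from this feature alone.
    """
    _, feat_idx = np.unique(feature, return_inverse=True)
    classes, lab_idx = np.unique(labels, return_inverse=True)
    # Empirical joint distribution P(c, f) from co-occurrence counts.
    joint = np.zeros((len(classes), feat_idx.max() + 1))
    for c, f in zip(lab_idx, feat_idx):
        joint[c, f] += 1
    joint /= len(labels)
    return -np.log2(joint.max(axis=0).sum())

def rank_features(X, y):
    """Order feature columns by increasing conditional min-entropy
    (most relevant for classification first)."""
    scores = [cond_min_entropy(y, X[:, j]) for j in range(X.shape[1])]
    return np.argsort(scores)
```

For instance, a feature that perfectly determines the class gets score 0 (the MAP classifier is always right), while one independent of the class gets the full min-entropy of the class prior.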

Original language: English
Title of host publication: Artificial Neural Networks in Pattern Recognition - 8th IAPR TC3 Workshop, ANNPR 2018, Proceedings
Editors: Luca Pancioni, Edmondo Trentin, Friedhelm Schwenker
Publisher: Springer Verlag
Pages: 226-239
Number of pages: 14
ISBN (Print): 9783319999777
DOIs
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: 8th IAPR TC3 Workshop on Artificial Neural Networks for Pattern Recognition, ANNPR 2018 - Siena, Italy
Duration: 19 Sept 2018 – 21 Sept 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11081 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 8th IAPR TC3 Workshop on Artificial Neural Networks for Pattern Recognition, ANNPR 2018
Country/Territory: Italy
City: Siena
Period: 19/09/18 – 21/09/18
