Optimal exponential bounds for aggregation of estimators for the Kullback-Leibler loss

Research output: Contribution to journal › Article › peer-review

Abstract

We study the problem of aggregation of estimators with respect to the Kullback-Leibler divergence for various probabilistic models. Rather than considering a convex combination of the initial estimators f1, …, fN, our aggregation procedures rely on the convex combination of the logarithms of these functions. The first method is designed for probability density estimation as it gives an aggregate estimator that is also a proper density function, whereas the second method concerns spectral density estimation and has no such mass-conserving feature. We select the aggregation weights based on a penalized maximum likelihood criterion. We give sharp oracle inequalities that hold with high probability, with a remainder term that is decomposed into a bias and a variance part. We also show the optimality of the remainder terms by providing the corresponding lower bound results.
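The first aggregation scheme described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's procedure: it assumes a fixed evaluation grid, three hypothetical Gaussian candidate densities, and hand-chosen convex weights (the paper selects them via a penalized maximum likelihood criterion). It shows why aggregating on the log scale and renormalizing yields a proper density.

```python
import numpy as np

# Evaluation grid (assumed for illustration)
x = np.linspace(-5.0, 5.0, 2001)
dx = x[1] - x[0]

def gauss(x, mu, sigma):
    """Gaussian density, standing in for a candidate estimator f_j."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Candidate density estimators f_1, ..., f_N evaluated on the grid
candidates = np.stack([gauss(x, mu, 1.0) for mu in (-1.0, 0.0, 1.0)])

# Convex weights on the simplex (hand-picked here; the paper chooses
# them by penalized maximum likelihood)
weights = np.array([0.2, 0.5, 0.3])

# Aggregate the logarithms: exp(sum_j w_j * log f_j), then renormalize
# so the aggregate is itself a proper probability density.
log_agg = weights @ np.log(candidates)
f_agg = np.exp(log_agg)
f_agg /= f_agg.sum() * dx  # Riemann-sum normalization

print(round(f_agg.sum() * dx, 6))  # → 1.0 (integrates to one)
```

Note that a convex combination of logarithms is a geometric-mean-type aggregate: before renormalization its mass is at most one (by Hölder's inequality), which is what makes the renormalized version a bona fide density.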

Original language: English
Pages (from-to): 2258-2294
Number of pages: 37
Journal: Electronic Journal of Statistics
Volume: 11
Issue number: 1
Publication status: Published - 1 Jan 2017

Keywords

  • Aggregation
  • Kullback-Leibler divergence
  • Probability density estimation
  • Sharp oracle inequality
  • Spectral density estimation

