TY - JOUR
T1 - Adaptive Bayesian estimation in indirect Gaussian sequence space models
AU - Johannes, Jan
AU - Simoni, Anna
AU - Schenk, Rudolf
N1 - Publisher Copyright:
© 2020 GENES (Groupe des Ecoles en Economie et Statistiques). All rights reserved.
PY - 2020/3/1
Y1 - 2020/3/1
N2 - In an indirect Gaussian sequence space model we derive lower and upper bounds for the concentration rate of the posterior distribution of the parameter of interest shrinking to the parameter value θ° that generates the data. While this establishes posterior consistency, the concentration rate depends on both θ° and a tuning parameter which enters the prior distribution. We first provide an oracle optimal choice of the tuning parameter, i.e., optimized for each θ° separately. The optimal choice of the prior distribution allows us to derive an oracle optimal concentration rate of the associated posterior distribution. Moreover, for a given class of parameters and a suitable choice of the tuning parameter, we show that the resulting uniform concentration rate over the given class is optimal in a minimax sense. Finally, we construct a hierarchical prior that is adaptive for mildly ill-posed inverse problems. This means that, given a parameter θ° or a class of parameters, the posterior distribution contracts at the oracle rate or at the minimax rate over the class, respectively. Notably, the hierarchical prior depends neither on θ° nor on the given class. Moreover, convergence of the fully data-driven Bayes estimator at the oracle or at the minimax rate is established. JEL Codes: C11, C14.
AB - In an indirect Gaussian sequence space model we derive lower and upper bounds for the concentration rate of the posterior distribution of the parameter of interest shrinking to the parameter value θ° that generates the data. While this establishes posterior consistency, the concentration rate depends on both θ° and a tuning parameter which enters the prior distribution. We first provide an oracle optimal choice of the tuning parameter, i.e., optimized for each θ° separately. The optimal choice of the prior distribution allows us to derive an oracle optimal concentration rate of the associated posterior distribution. Moreover, for a given class of parameters and a suitable choice of the tuning parameter, we show that the resulting uniform concentration rate over the given class is optimal in a minimax sense. Finally, we construct a hierarchical prior that is adaptive for mildly ill-posed inverse problems. This means that, given a parameter θ° or a class of parameters, the posterior distribution contracts at the oracle rate or at the minimax rate over the class, respectively. Notably, the hierarchical prior depends neither on θ° nor on the given class. Moreover, convergence of the fully data-driven Bayes estimator at the oracle or at the minimax rate is established. JEL Codes: C11, C14.
KW - Adaptation
KW - Bayesian Nonparametrics
KW - Exact Concentration Rates
KW - Hierarchical Bayes
KW - Minimax Theory
KW - Oracle Optimality
KW - Sieve Prior
U2 - 10.15609/ANNAECONSTAT2009.137.0083
DO - 10.15609/ANNAECONSTAT2009.137.0083
M3 - Article
AN - SCOPUS:85085497022
SN - 2115-4430
SP - 83
EP - 116
JO - Annals of Economics and Statistics
JF - Annals of Economics and Statistics
IS - 137
ER -