TY - JOUR
T1 - Differentiable samplers for deep latent variable models
AU - Doucet, Arnaud
AU - Moulines, Eric
AU - Thin, Achille
N1 - Publisher Copyright:
© 2023 The Authors.
PY - 2023/1/1
Y1 - 2023/1/1
N2 - Latent variable models are a popular class of models in statistics. Combined with neural networks to improve their expressivity, the resulting deep latent variable models have also found numerous applications in machine learning. A drawback of these models is that their likelihood function is intractable, so approximations must be carried out to perform inference. A standard approach consists of instead maximizing an evidence lower bound (ELBO) based on a variational approximation of the posterior distribution of the latent variables. The standard ELBO can, however, be a very loose bound if the variational family is not rich enough. A generic strategy to tighten such bounds is to rely on an unbiased low-variance Monte Carlo estimate of the evidence. We review here some recent importance sampling, Markov chain Monte Carlo and sequential Monte Carlo strategies that have been proposed to achieve this. This article is part of the theme issue 'Bayesian inference: challenges, perspectives, and prospects'.
AB - Latent variable models are a popular class of models in statistics. Combined with neural networks to improve their expressivity, the resulting deep latent variable models have also found numerous applications in machine learning. A drawback of these models is that their likelihood function is intractable, so approximations must be carried out to perform inference. A standard approach consists of instead maximizing an evidence lower bound (ELBO) based on a variational approximation of the posterior distribution of the latent variables. The standard ELBO can, however, be a very loose bound if the variational family is not rich enough. A generic strategy to tighten such bounds is to rely on an unbiased low-variance Monte Carlo estimate of the evidence. We review here some recent importance sampling, Markov chain Monte Carlo and sequential Monte Carlo strategies that have been proposed to achieve this. This article is part of the theme issue 'Bayesian inference: challenges, perspectives, and prospects'.
KW - Bayesian inference
KW - Monte Carlo methods
KW - importance sampling
KW - variational inference
U2 - 10.1098/rsta.2022.0147
DO - 10.1098/rsta.2022.0147
M3 - Review article
C2 - 36970826
AN - SCOPUS:85150985185
SN - 1364-503X
VL - 381
JO - Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
JF - Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
IS - 2247
M1 - 20220147
ER -