
On stochastic gradient Langevin dynamics with dependent data streams in the logconcave case

  • M. Barkhagen
  • N. H. Chau
  • Moulines
  • M. Rásonyi
  • S. Sabanis
  • Y. Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

We study the problem of sampling from a probability distribution π on ℝ^d which has a density w.r.t. the Lebesgue measure known up to a normalization factor, x ↦ e^{−U(x)} / ∫_{ℝ^d} e^{−U(y)} dy. We analyze a sampling method based on the Euler discretization of the Langevin stochastic differential equation under the assumptions that the potential U is continuously differentiable, ∇U is Lipschitz, and U is strongly convex. We focus on the case where the gradient of the log-density cannot be directly computed but unbiased estimates of the gradient from possibly dependent observations are available. This setting can be seen as a combination of stochastic approximation (here stochastic gradient) type algorithms with discretized Langevin dynamics. We obtain an upper bound on the Wasserstein-2 distance between the law of the iterates of this algorithm and the target distribution π, with constants depending explicitly on the Lipschitz and strong convexity constants of the potential and the dimension of the space. Finally, under weaker assumptions on U and its gradient but in the presence of independent observations, we obtain analogous results in Wasserstein-2 distance.
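The iteration the abstract describes (stochastic gradient Langevin dynamics: an Euler step on the Langevin SDE with an unbiased, data-driven gradient estimate plus injected Gaussian noise) can be sketched as below. This is a minimal illustration, not the paper's setup: the quadratic potential U(x) = ‖x‖²/2, the i.i.d. surrogate "data stream", the step size, and the iteration count are all assumptions chosen so the example runs quickly.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative strongly convex potential: U(x) = ||x||^2 / 2, so grad U(x) = x.
# The target pi is then proportional to e^{-U(x)}, i.e. a standard Gaussian.
def grad_estimate(x, obs):
    # obs has mean zero, so x + obs is an unbiased estimate of grad U(x).
    # It stands in for the paper's gradient estimates built from
    # (possibly dependent) observations; here the stream is i.i.d.
    return x + obs

d = 2            # dimension
gamma = 0.01     # step size (assumed small)
n_steps = 20000
x = np.zeros(d)
samples = []

for k in range(n_steps):
    obs = rng.normal(scale=0.1, size=d)   # surrogate data stream
    noise = rng.normal(size=d)            # injected Gaussian noise
    # SGLD step: Euler discretization of dX_t = -grad U(X_t) dt + sqrt(2) dW_t,
    # with the exact gradient replaced by its unbiased estimate.
    x = x - gamma * grad_estimate(x, obs) + np.sqrt(2.0 * gamma) * noise
    samples.append(x.copy())

samples = np.asarray(samples[n_steps // 2:])  # discard burn-in
# For this U the target is a standard Gaussian, so the empirical variance
# of each coordinate should be close to 1.
print(samples.var(axis=0))
```

The Wasserstein-2 bound in the paper quantifies how close the law of these iterates gets to π as the step size shrinks; in this toy run the empirical per-coordinate variance lands near the target's value of 1.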

Original language: English
Pages (from-to): 1-33
Number of pages: 33
Journal: Bernoulli
Volume: 27
Issue number: 1
DOIs
Status: Published - 1 Feb 2021

