State and parameter learning with PARIS particle Gibbs

  • Gabriel Cardoso
  • Yazid Janati El Idrissi
  • Sylvain Le Corff
  • Éric Moulines
  • Jimmy Olsson

Research output: Contribution to journal › Conference article › peer-review

Abstract

Non-linear state-space models, also known as general hidden Markov models (HMMs), are ubiquitous in statistical machine learning, being the most classical generative models for serial data and sequences. Learning in HMMs, whether via Maximum Likelihood Estimation (MLE) or Markov Score Climbing (MSC), requires estimating the smoothing expectation of some additive functionals. Controlling the bias and the variance of this estimation is crucial to establish the convergence of learning algorithms. Our first contribution is to design a novel additive smoothing algorithm, the Parisian particle Gibbs (PPG) sampler, which can be viewed as a PARIS (Olsson & Westerborn, 2017) algorithm driven by conditional sequential Monte Carlo (SMC) moves, resulting in bias-reduced estimates of the targeted quantities. We substantiate the PPG algorithm with theoretical results, including new bounds on bias and variance as well as deviation inequalities. We then establish, in the learning context and under standard assumptions, non-asymptotic bounds highlighting the value of bias reduction and the implicit Rao-Blackwellization of PPG. These are the first non-asymptotic results of this kind in this setting. We illustrate our theoretical results with numerical experiments supporting our claims.
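To make the algorithmic idea in the abstract concrete, the following is a minimal, hypothetical Python sketch of one such sweep: a bootstrap conditional SMC pass in which a retained reference trajectory is frozen, combined with PARIS-style backward sampling (K backward draws per particle) to estimate the smoothed expectation of an additive functional. The linear-Gaussian model, its parameter values, and the functional h_t(x) = x_t are illustrative assumptions, not taken from the paper; consult the paper and Olsson & Westerborn (2017) for the exact PPG algorithm and its guarantees.

import numpy as np

rng = np.random.default_rng(0)

# Assumed linear-Gaussian toy model (illustration only):
#   x_t = phi * x_{t-1} + sigma * eps_t,   y_t = x_t + tau * eta_t
phi, sigma, tau = 0.9, 0.5, 1.0
T, N, K = 50, 100, 2   # horizon, particle count, PARIS backward draws per particle

def transition_logpdf(x_next, x_prev):
    # log q(x_next | x_prev), up to an additive constant
    return -0.5 * ((x_next - phi * x_prev) / sigma) ** 2

def csmc_paris_sweep(y, x_ref):
    """One bootstrap conditional SMC pass that freezes the retained path x_ref,
    with PARIS-style backward sampling of the additive functional sum_t x_t."""
    x = sigma * rng.standard_normal(N)
    x[0] = x_ref[0]                       # slot 0 carries the frozen reference
    stats = x.copy()                      # running PARIS statistics, h_t(x) = x_t
    logw = -0.5 * ((y[0] - x) / tau) ** 2
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        anc = rng.choice(N, size=N, p=w)  # multinomial resampling
        anc[0] = 0                        # the reference keeps its own ancestry
        x_new = phi * x[anc] + sigma * rng.standard_normal(N)
        x_new[0] = x_ref[t]               # conditional move: the reference survives
        new_stats = np.empty(N)
        for i in range(N):
            # backward kernel: proportional to w_{t-1}^j * q(x_t^i | x_{t-1}^j)
            logb = np.log(w + 1e-300) + transition_logpdf(x_new[i], x)
            b = np.exp(logb - logb.max()); b /= b.sum()
            j = rng.choice(N, size=K, p=b)
            new_stats[i] = stats[j].mean() + x_new[i]
        x, stats = x_new, new_stats
        logw = -0.5 * ((y[t] - x) / tau) ** 2
    w = np.exp(logw - logw.max()); w /= w.sum()
    # A full PPG iteration would also draw a fresh reference trajectory here
    # (e.g. by backward sampling) to condition the next sweep; omitted for brevity.
    return float(np.dot(w, stats))        # estimate of E[sum_t x_t | y_{0:T-1}]

# Simulate data from the assumed model and run one sweep from a crude reference.
x_true = np.empty(T)
x_true[0] = sigma * rng.standard_normal()
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + sigma * rng.standard_normal()
y = x_true + tau * rng.standard_normal(T)

print(csmc_paris_sweep(y, np.zeros(T)))

The key conditional-SMC feature is that the reference path occupies a fixed particle slot and survives every resampling step, which is what turns the sweep into a valid Markov kernel on trajectories; the PARIS recursion then propagates the additive statistics forward, so no full backward pass over stored trajectories is needed.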

Original language: English
Pages (from-to): 3625–3675
Number of pages: 51
Journal: Proceedings of Machine Learning Research
Volume: 202
Publication status: Published - 1 Jan 2023
Event: 40th International Conference on Machine Learning, ICML 2023 - Honolulu, United States
Duration: 23 Jul 2023 – 29 Jul 2023
