Fast and Consistent Learning of Hidden Markov Models by Incorporating Non-Consecutive Correlations

  • Robert Mattila
  • Cristian R. Rojas
  • Eric Moulines
  • Vikram Krishnamurthy
  • Bo Wahlberg

Research output: Contribution to journal › Conference article › Peer-reviewed

Abstract

Can the parameters of a hidden Markov model (HMM) be estimated from a single sweep through the observations – and, additionally, without being trapped at a local optimum in the likelihood surface? That is the premise of recent method of moments algorithms devised for HMMs. In these, correlations between consecutive pair- or triplet-wise observations are empirically estimated and used to compute estimates of the HMM parameters. Although computationally very attractive, their main drawback is that restricting attention to only low-order correlations neglects information in the data, which results in a loss of accuracy compared to standard maximum likelihood schemes. In this paper, we propose extending these methods (both pair- and triplet-based) by also including non-consecutive correlations, in a way that does not significantly increase the computational cost (which scales linearly with the number of additional lags included). We prove strong consistency of the new methods, and demonstrate improved performance in numerical experiments on both synthetic and real-world financial time-series datasets.
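To make the abstract's notion of "empirically estimated correlations" concrete, the sketch below estimates pairwise co-occurrence matrices P_τ[i, j] ≈ Pr(y_t = i, y_{t+τ} = j) for a discrete-observation sequence at several lags τ, in a single sweep through the data. This is only an illustration of the empirical moments the method-of-moments family builds on; the function name and interface are hypothetical, and the paper's actual estimator (which maps such moments to HMM parameters) is not reproduced here.

```python
import numpy as np

def empirical_pair_moments(obs, num_symbols, lags=(1, 2, 3)):
    """Estimate P_tau[i, j] ~= Pr(y_t = i, y_{t+tau} = j) for each lag tau.

    Illustrative sketch only -- one pass over the observation sequence per
    lag, so the cost scales linearly with the number of lags included.
    """
    obs = np.asarray(obs)
    moments = {}
    for tau in lags:
        counts = np.zeros((num_symbols, num_symbols))
        # Count co-occurrences of symbol pairs that are tau steps apart.
        np.add.at(counts, (obs[:-tau], obs[tau:]), 1.0)
        # Normalize counts into an empirical joint probability matrix.
        moments[tau] = counts / counts.sum()
    return moments

# Toy usage: a sequence over 2 symbols; lag 1 is the "consecutive"
# correlation used by earlier methods, lag 5 is a non-consecutive one.
rng = np.random.default_rng(0)
obs = rng.integers(0, 2, size=10_000)
P = empirical_pair_moments(obs, num_symbols=2, lags=(1, 5))
print(P[1].shape, round(P[1].sum(), 6))
```

For an HMM with emission matrix B and transition matrix A, these moment matrices satisfy algebraic relations in A and B, which is what lets moment-based methods recover the parameters without iterating over a likelihood surface.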

Original language: English
Pages (from-to): 6785-6796
Number of pages: 12
Journal: Proceedings of Machine Learning Research
Volume: 119
Publication status: Published - 1 Jan 2020
Event: 37th International Conference on Machine Learning, ICML 2020 - Virtual, Online
Duration: 13 Jul 2020 - 18 Jul 2020
