Expressivity of Hidden Markov Chains vs. Recurrent Neural Networks From a System Theoretic Viewpoint

Research output: Contribution to journal › Article › peer-review

Abstract

Hidden Markov Chains (HMC) and Recurrent Neural Networks (RNN) are two well-known tools for predicting time series. Even though these solutions were developed independently in distinct communities, they share some similarities when considered as probabilistic structures. In this paper we first consider HMC and RNN as generative models, and we embed both structures in a common generative unified model (GUM). We next address a comparative study of the expressivity (or modeling power) of these models, which here refers to the range of joint probability distributions of an observation sequence induced by the underlying latent variables. To that end, we further assume that the models are linear and Gaussian. The probability distributions produced by these models are characterized by structured covariance series; as a consequence, expressivity reduces to comparing sets of structured covariance series, which enables us to call on stochastic realization theory (SRT). We finally provide conditions under which a given covariance series can be realized by a GUM, an HMC or an RNN.
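To illustrate the kind of object the abstract refers to, here is a minimal sketch of a linear-Gaussian HMC (i.e., a linear state-space model) and its structured output covariance series. The matrices A, C, Q, R below are hypothetical example values, not taken from the paper; the covariance formulas are the standard stationary ones for such models.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Hypothetical stable linear-Gaussian HMC (state-space model):
#   x_{t+1} = A x_t + w_t,  w_t ~ N(0, Q)   (latent state)
#   y_t     = C x_t + v_t,  v_t ~ N(0, R)   (observation)
A = np.array([[0.9, 0.1],
              [0.0, 0.5]])
C = np.array([[1.0, 0.0]])
Q = 0.1 * np.eye(2)
R = np.array([[0.05]])

# Stationary state covariance P solves the discrete Lyapunov equation
#   P = A P A^T + Q
P = solve_discrete_lyapunov(A, Q)

# Structured covariance series of the observation process:
#   Sigma_0 = C P C^T + R,   Sigma_k = C A^k P C^T  for k >= 1
def output_cov(k):
    if k == 0:
        return C @ P @ C.T + R
    return C @ np.linalg.matrix_power(A, k) @ P @ C.T

cov_series = [output_cov(k) for k in range(5)]
```

Comparing which covariance series of this structured form each model class (GUM, HMC, RNN) can produce is, in essence, the expressivity question the paper studies via stochastic realization theory.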

Original language: English
Pages (from-to): 4178-4191
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 71
DOIs
Publication status: Published - 1 Jan 2023

Keywords

  • Hidden Markov Chains
  • Recurrent Neural Networks
  • expressivity
  • generative models
  • modeling power
  • stochastic realization theory
