Markov Chains: Basic Definitions

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

Heuristically, a discrete-time stochastic process has the Markov property if the past and the future are independent given the present. In this introductory chapter, we give the formal definition of a Markov chain and of the main objects related to this type of stochastic process, and we establish basic results. In particular, we introduce in Section 1.2 the essential notion of a Markov kernel, which gives the distribution of the next state given the current state. In Section 1.3, we restrict attention to time-homogeneous Markov chains and establish a fundamental consequence of the Markov property: the entire distribution of a Markov chain is characterized by the distribution of its initial state together with a Markov kernel. In Section 1.4, we introduce invariant measures, which play a key role in the study of the long-term behavior of a Markov chain. Finally, in Sections 1.5 and 1.6, which can be skipped on a first reading, we introduce the notion of reversibility, which is very convenient and is satisfied by many Markov chains, and we present further properties of Markov kernels viewed as operators on certain spaces of functions.
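The objects described in the abstract can be illustrated concretely in the finite-state case, where a Markov kernel reduces to a row-stochastic transition matrix. The following is a minimal sketch (not taken from the chapter) using a hypothetical 3-state matrix P: it simulates a path by sampling each next state from the row of the current state, and computes an invariant distribution pi as a left eigenvector of P for eigenvalue 1, so that pi P = pi.

```python
import numpy as np

# Hypothetical 3-state Markov kernel: P[i, j] is the probability of
# moving from state i to state j; each row sums to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.3, 0.5],
])

def simulate(P, x0, n, rng):
    """Draw a path X_0, ..., X_n: each step samples X_{k+1} from the
    row P[X_k], which is exactly the kernel's role of giving the
    distribution of the next state given the current one."""
    path = [x0]
    for _ in range(n):
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

def invariant_distribution(P):
    """Return a probability vector pi with pi @ P == pi, obtained as
    the left eigenvector of P associated with eigenvalue 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    return pi / pi.sum()

rng = np.random.default_rng(0)
path = simulate(P, x0=0, n=10, rng=rng)
pi = invariant_distribution(P)
assert np.allclose(pi @ P, pi)  # pi is invariant under the kernel
```

This also illustrates the characterization result from Section 1.3: the law of the whole path is determined by the initial state `x0` (more generally, an initial distribution) and the matrix `P` alone.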

Original language: English
Title of host publication: Springer Series in Operations Research and Financial Engineering
Publisher: Springer Nature
Pages: 3-25
Number of pages: 23
DOIs
Publication status: Published - 1 Jan 2018

Publication series

Name: Springer Series in Operations Research and Financial Engineering
ISSN (Print): 1431-8598
ISSN (Electronic): 2197-1773
