Probability: Markov chains: introduction

The tutor is happy to introduce the elegant topic of Markov chains.

A Markov chain is a sequence of states through which a probabilistic system can pass. It’s not as complex as it sounds. Consider the following example:

Ms A must choose each day between a vegetarian lunch and a meat one. Here is a description of the parameters:

1: veggie

2: meat

1,1: veggie today, then veggie tomorrow

1,2: veggie today, but meat tomorrow

2,1: meat today, veggie tomorrow

2,2: meat today, meat tomorrow

In the probability matrix, the entries are referred to by (row, column).

Now let’s imagine the probabilities associated with each choice set are as follows:

1,1: 0.55 (if veggie today, then 55% probability of veggie tomorrow as well)

1,2: 0.45 (if veggie today, then 45% probability of meat tomorrow)

2,1: 0.31 (if meat today, then 31% probability of veggie tomorrow)

2,2: 0.69 (if meat today, then 69% probability of meat tomorrow, too)
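To see how these conditional probabilities chain together, here is a quick check (my own illustrative arithmetic, not from the source): if Ms A eats veggie today, she eats veggie two days from now either by eating veggie both days, or by switching to meat and back.

```python
# P(veggie in two days | veggie today)
# = P(veggie tomorrow) * P(veggie after veggie)
# + P(meat tomorrow)   * P(veggie after meat)
p = 0.55 * 0.55 + 0.45 * 0.31
print(round(p, 4))  # 0.442
```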

In matrix form (rows are today’s lunch, columns are tomorrow’s):

           1       2
    1  [  0.55    0.45  ]
    2  [  0.31    0.69  ]

Such a matrix can be called a probability matrix; another name is transition matrix.
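To make the setup concrete, here is a minimal Python sketch (my own illustration, not from the source) that stores the transition matrix as nested lists and simulates a run of Ms A's lunch choices:

```python
import random

# Transition matrix: rows = today's lunch, columns = tomorrow's.
# Index 0 = veggie, index 1 = meat.
T = [[0.55, 0.45],
     [0.31, 0.69]]

def next_lunch(today):
    """Pick tomorrow's lunch given today's state (0 = veggie, 1 = meat)."""
    return 0 if random.random() < T[today][0] else 1

# Simulate 10 days, starting from a veggie lunch.
random.seed(1)          # fixed seed so the run is repeatable
state = 0
week = [state]
for _ in range(10):
    state = next_lunch(state)
    week.append(state)
print(week)  # 11 lunches: today plus the next 10 days
```

The key Markov property shows up in `next_lunch`: tomorrow's choice depends only on today's state, not on the earlier history.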

Observations:

1) Being probabilities, the entries e must each satisfy 0 ≤ e ≤ 1.
2) The entries in a given row must sum to 1, since they represent all possibilities.
3) The diagonal entries (1,1 and 2,2) give the probability of staying in the same state.

A square matrix that satisfies 1) and 2) is called stochastic.
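A minimal sketch of that check in Python (the function name `is_stochastic` is my own):

```python
def is_stochastic(matrix, tol=1e-9):
    """True if matrix is square, entries lie in [0, 1], and each row sums to 1."""
    n = len(matrix)
    for row in matrix:
        if len(row) != n:
            return False  # not square
        if any(e < 0 or e > 1 for e in row):
            return False  # entry outside [0, 1]
        if abs(sum(row) - 1) > tol:
            return False  # row probabilities don't sum to 1
    return True

print(is_stochastic([[0.55, 0.45], [0.31, 0.69]]))  # True
print(is_stochastic([[0.50, 0.60], [0.31, 0.69]]))  # False: first row sums to 1.1
```

The tolerance parameter guards against floating-point rounding when summing the rows.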

This is a good first step on our journey through Markov chains.

HTH:)

Sources:

Tan, Soo Tang. Applied Finite Mathematics, 3rd ed. Boston: PWS-Kent Publishing Company, 1990.

Jack of Oracle Tutoring by Jack and Diane, Campbell River, BC.
