Markov Chain: Definition and Representation with Solved Problems

A Markov process is said to be a Markov Chain if it takes its values in a discrete state space $E$.
A Markov Chain is a discrete-time Markov process whose future behavior depends only on the current (present) state, not on the past states. This property is also called the memoryless property of a Markov Chain, or the Markovian property.


A discrete-parameter Markov Chain (MC) is written as $\{X_n, n \ge 0\}$. The Markovian property states that

$P(X_{n+1} = j \mid X_0 = i_0, X_1 = i_1, \ldots, X_n = i) = P(X_{n+1} = j \mid X_n = i) = P_{ij}(n, n+1)$

and, for a homogeneous chain (one whose transition probabilities do not depend on $n$), this equals the one-step transition probability $P_{ij}(1)$.
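
To make the memoryless property concrete, here is a minimal Python sketch (not part of the original post; the matrix `P` and the helper `step` are illustrative assumptions) in which the next state is drawn using only the current state's row of the transition matrix:

```python
import random

def step(state, P):
    """Draw the next state using only row `state` of P; the earlier
    path is never consulted (the Markovian property)."""
    r, cum = random.random(), 0.0
    for nxt, p in enumerate(P[state]):
        cum += p
        if r < cum:
            return nxt
    return len(P[state]) - 1  # guard against floating-point rounding

P = [[0.5, 0.5],   # one-step probabilities P_ij(1) for a hypothetical chain
     [0.1, 0.9]]

state, path = 0, [0]
for _ in range(10):
    state = step(state, P)
    path.append(state)
print(path)   # e.g. [0, 1, 1, 1, 0, 0, 1, 1, 1, 1, 1]
```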


Representation of Discrete Parameter Markov Chain

The joint probability distribution of a Markov Chain can be represented in two ways:
  1. Step Transition Probability
  2. Transition Probability Matrix

Step Transition Probability

$P(X_0 = i_0, X_1 = i_1, \ldots, X_{n-1} = i_{n-1}, X_n = i_n)$
$= P(X_n = i_n \mid X_0 = i_0, X_1 = i_1, \ldots, X_{n-1} = i_{n-1}) \cdot P(X_0 = i_0, X_1 = i_1, \ldots, X_{n-1} = i_{n-1})$
$= P(X_n = i_n \mid X_{n-1} = i_{n-1}) \cdot P(X_0 = i_0, \ldots, X_{n-1} = i_{n-1})$   [Using Markovian property]
$= P_{i_{n-1} i_n}(1) \cdot P(X_0 = i_0, \ldots, X_{n-1} = i_{n-1})$
$= P_{i_{n-1} i_n}(1) \cdot P(X_{n-1} = i_{n-1} \mid X_0 = i_0, \ldots, X_{n-2} = i_{n-2}) \cdot P(X_0 = i_0, \ldots, X_{n-2} = i_{n-2})$
$= P_{i_{n-1} i_n}(1) \cdot P_{i_{n-2} i_{n-1}}(1) \cdot P(X_0 = i_0, \ldots, X_{n-2} = i_{n-2})$
$= P_{i_{n-1} i_n}(1) \cdot P_{i_{n-2} i_{n-1}}(1) \cdot P_{i_{n-3} i_{n-2}}(1) \cdots P_{i_0 i_1}(1) \cdot P(X_0 = i_0)$

where $P(X_0 = i_0)$ is the initial probability of state $i_0$, taken from the initial probability vector $P(0)$.
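
The factorization above says a path's probability is the initial probability times a product of one-step transition probabilities. A small Python sketch of that product (the function name `path_probability` and the example matrix are assumptions for illustration):

```python
def path_probability(path, P, p0):
    """P(X0=i0, ..., Xn=in) = p0[i0] * P[i0][i1] * ... * P[i_{n-1}][i_n],
    exactly the chain-rule product derived above."""
    prob = p0[path[0]]               # initial probability P(X0 = i0)
    for a, b in zip(path, path[1:]):
        prob *= P[a][b]              # one-step factor P_ab(1)
    return prob

# Hypothetical two-state example:
P  = [[0.5, 0.5],
      [0.1, 0.9]]
p0 = [1.0, 0.0]
print(path_probability([0, 1, 1], P, p0))   # 1.0 * 0.5 * 0.9 = 0.45
```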



Transition Probability Matrix


Consider a Markov Chain $\{X_n, n \ge 0\}$ with discrete state space $\{0, 1, 2, 3, \ldots\}$. Then the Markov Chain can be determined by its transition probability matrix:

$P = \begin{pmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ P_{20} & P_{21} & P_{22} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$

where $P_{ij} = P_{ij}(1) = P(X_{n+1} = j \mid X_n = i)$, each $P_{ij} \ge 0$, and every row sums to 1.
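
As a rough sanity check, the two defining properties of a transition probability matrix can be verified in a few lines of Python (the helper `is_stochastic` is an illustrative name, not from the post):

```python
def is_stochastic(P, tol=1e-9):
    """Check the two defining properties of a transition probability
    matrix: non-negative entries, and each row summing to 1 (row i is
    the conditional distribution of X_{n+1} given X_n = i)."""
    return (all(p >= 0 for row in P for p in row)
            and all(abs(sum(row) - 1.0) <= tol for row in P))

print(is_stochastic([[0.5, 0.5], [0.1, 0.9]]))   # True
print(is_stochastic([[0.5, 0.6], [0.1, 0.9]]))   # False: first row sums to 1.1
```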




Solved Problems on Markov Chains

Solved Problem 1: Consider a sample weather model with two states: rainy (denoted by 0) and sunny (denoted by 1). The transition probability matrix is given by

$P = \begin{pmatrix} 0.5 & 0.5 \\ 0.1 & 0.9 \end{pmatrix}$

If the initial probability vector is $P(0) = (1 \;\; 0)$, what is the probability that the third day will be sunny?

Solution: The state distribution after $n$ days is $P(n) = P(0)P^n$. Taking the given day as day 0:

$P(1) = P(0)P = (0.5 \;\; 0.5)$

$P(2) = P(1)P = (0.5(0.5) + 0.5(0.1) \;\;\; 0.5(0.5) + 0.5(0.9)) = (0.30 \;\; 0.70)$

$P(3) = P(2)P = (0.30(0.5) + 0.70(0.1) \;\;\; 0.30(0.5) + 0.70(0.9)) = (0.22 \;\; 0.78)$

Therefore the probability that the third day will be sunny is $P(X_3 = 1) = 0.78$.
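
The arithmetic above can be double-checked with a short Python sketch that propagates the initial vector through $P$ three times (plain Python, no libraries assumed):

```python
P = [[0.5, 0.5],    # rainy -> (rainy, sunny)
     [0.1, 0.9]]    # sunny -> (rainy, sunny)
p = [1.0, 0.0]      # P(0): day 0 is rainy

for _ in range(3):  # p <- p P, one day per step
    p = [sum(p[i] * P[i][j] for i in range(2)) for j in range(2)]

print(p)            # ~ [0.22, 0.78]; P(sunny on day 3) = 0.78
```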

Solved Problem 2: The transition probability matrix of a Markov Chain having three states 1, 2 and 3 is $P$, whose entries used below include $P_{12}(1) = 0.3$, $P_{23}(1) = 0.3$ and $P_{31}(1) = 0.4$, and the initial probability distribution is $P(0) = (0.5 \;\; 0.3 \;\; 0.2)$. Find the following:

i) $P(X_2 = 2)$
ii) $P(X_3 = 3, X_2 = 2, X_1 = 1, X_0 = 3)$
Solution:

i. $P(X_2 = 2)$ is the second component of the vector $P(2) = P(0)P^2$; equivalently, $P(X_2 = 2) = \sum_i \sum_j P(X_0 = i)\, P_{ij}(1)\, P_{j2}(1)$, evaluated using the entries of the given matrix.
ii. $P(X_3 = 3, X_2 = 2, X_1 = 1, X_0 = 3)$
$= P(X_3 = 3 \mid X_2 = 2, X_1 = 1, X_0 = 3) \cdot P(X_2 = 2, X_1 = 1, X_0 = 3)$
$= P(X_3 = 3 \mid X_2 = 2) \cdot P(X_2 = 2, X_1 = 1, X_0 = 3)$   [Using Markovian property]
$= P_{23}(1) \cdot P(X_2 = 2 \mid X_1 = 1, X_0 = 3) \cdot P(X_1 = 1, X_0 = 3)$
$= P_{23}(1) \cdot P(X_2 = 2 \mid X_1 = 1) \cdot P(X_1 = 1, X_0 = 3)$   [Using Markovian property]
$= P_{23}(1) \cdot P_{12}(1) \cdot P(X_1 = 1 \mid X_0 = 3) \cdot P(X_0 = 3)$
$= P_{23}(1) \cdot P_{12}(1) \cdot P_{31}(1) \cdot P(X_0 = 3)$
$= 0.3 \times 0.3 \times 0.4 \times 0.2$
$= 0.0072$
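
A quick numeric check of this product in Python (only the three one-step entries used in the derivation are reproduced here):

```python
P23, P12, P31 = 0.3, 0.3, 0.4   # one-step entries used in the derivation
p0_3 = 0.2                      # P(X0 = 3), third component of P(0)

joint = P23 * P12 * P31 * p0_3  # P23(1) * P12(1) * P31(1) * P(X0 = 3)
print(joint)                    # ~ 0.0072
```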


