A Markov process is called a Markov Chain if it takes values in a discrete state space E.
A Markov Chain is a discrete-time Markov process whose future behavior depends only on the current (present) state, not on the past states. This property is called the memoryless property, or Markovian property, of the Markov Chain.
A Discrete Parameter Markov Chain (MC) is represented as {Xn, n ≥ 0}. The Markovian property states that
P(Xn+1 = j | X0 = i0, X1 = i1, ..., Xn = i) = P(Xn+1 = j | Xn = i)
This conditional probability is the one-step transition probability from state i to state j, written
Pij(n, n+1) = P(Xn+1 = j | Xn = i)
When the chain is homogeneous (the transition probabilities do not depend on n), it is written simply as Pij(1).
Representation of Discrete Parameter Markov Chain
The joint probability distribution of a Markov Chain can be represented in two ways:
- Step Transitional Probability
- Transition Probability Matrix
Step Transitional Probability
P(X0 = i0, X1 = i1, ..., Xn-1 = in-1, Xn = in)
= P(Xn = in | X0 = i0, X1 = i1, ..., Xn-1 = in-1) . P(X0 = i0, X1 = i1, ..., Xn-1 = in-1)
= P(Xn = in | Xn-1 = in-1) . P(X0 = i0, X1 = i1, ..., Xn-1 = in-1) [Using Markovian Property]
= Pin-1.in(1) . P(X0 = i0, X1 = i1, ..., Xn-1 = in-1)
= Pin-1.in(1) . P(Xn-1 = in-1 | X0 = i0, X1 = i1, ..., Xn-2 = in-2) . P(X0 = i0, X1 = i1, ..., Xn-2 = in-2)
= Pin-1.in(1) . Pin-2.in-1(1) . P(X0 = i0, X1 = i1, ..., Xn-2 = in-2) [Using Markovian Property]
= Pin-1.in(1) . Pin-2.in-1(1) . Pin-3.in-2(1) ... Pi0.i1(1) . P(X0 = i0)
where P(X0 = i0) is the initial probability of state i0.
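The factorisation above says a trajectory's probability is the initial probability times a product of one-step transition probabilities. A minimal sketch in Python (the names `path_probability`, `P`, and `p0` are illustrative, not from the original text):

```python
def path_probability(path, p0, P):
    """P(X0 = path[0], ..., Xn = path[n]) for a homogeneous Markov Chain.

    p0 is the initial probability vector, P the one-step transition matrix.
    """
    prob = p0[path[0]]               # initial factor P(X0 = i0)
    for a, b in zip(path, path[1:]):
        prob *= P[a][b]              # one-step factor P_ab(1)
    return prob

# Two-state example (states 0 and 1):
P = [[0.5, 0.5],
     [0.1, 0.9]]
p0 = [1.0, 0.0]
print(path_probability([0, 1, 1], p0, P))  # 1.0 * 0.5 * 0.9 = 0.45
```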
Transition Probability Matrix
Consider a Markov Chain {Xn, n ≥ 0} with discrete state space {0, 1, 2, 3, ...}. Then the Markov Chain can be described by its transition probability matrix P = [Pij], where Pij = P(Xn+1 = j | Xn = i). Every entry is non-negative and each row sums to 1.
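A valid transition probability matrix must have non-negative entries and rows that sum to 1. A small validity check, as a sketch (the name `is_stochastic` is illustrative):

```python
def is_stochastic(P, tol=1e-9):
    """Check that every row of P is a probability distribution."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

P = [[0.5, 0.5],
     [0.1, 0.9]]
print(is_stochastic(P))  # True: both rows sum to 1
```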
Solved Problems of Markov Chain
Solved Problem 1: Consider the sample weather model with two states, rainy (denoted by 0) and sunny (denoted by 1). The transition probability matrix is given by

P = ( 0.5  0.5 )
    ( 0.1  0.9 )

If the initial probability vector is P(0) = (1 0), then what is the probability that the third day will be sunny?

Solution: The distribution after n days is P(n) = P(0) P^n.
P(1) = P(0) P = (0.5 0.5)
P(2) = P(1) P = (0.5(0.5) + 0.5(0.1), 0.5(0.5) + 0.5(0.9)) = (0.3 0.7)
P(3) = P(2) P = (0.3(0.5) + 0.7(0.1), 0.3(0.5) + 0.7(0.9)) = (0.22 0.78)
Hence the probability that the third day will be sunny is P(X3 = 1) = 0.78.
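Solved Problem 1 can be checked numerically by multiplying the distribution by the matrix three times. A minimal pure-Python sketch (the helper name `step` is illustrative):

```python
def step(v, P):
    """One step of the chain: multiply row vector v by transition matrix P."""
    return [sum(v[i] * P[i][j] for i in range(len(v))) for j in range(len(P[0]))]

P = [[0.5, 0.5],   # rainy -> rainy, rainy -> sunny
     [0.1, 0.9]]   # sunny -> rainy, sunny -> sunny

p = [1.0, 0.0]     # initial distribution P(0): certainly rainy on day 0
for _ in range(3): # advance three days: P(3) = P(0) P^3
    p = step(p, P)

print([round(x, 2) for x in p])  # [0.22, 0.78] -> third day sunny with prob 0.78
```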
Solved Problem 2: The transition probability matrix of a Markov Chain having three states 1, 2, and 3 is
and the initial probability distribution is P(0) = (0.5 0.3 0.2). Find the following:
i) P(X2 = 2)
ii) P(X3 = 3, X2 = 2, X1 = 1, X0 = 3)
Solution:
ii. P(X3 = 3, X2 = 2, X1 = 1, X0 = 3)
= P(X3 = 3 | X2 = 2, X1 = 1, X0 = 3) . P(X2 = 2, X1 = 1, X0 = 3)
= P(X3 = 3 | X2 = 2) . P(X2 = 2, X1 = 1, X0 = 3) [Using Markovian Property]
= P23(1) . P(X2 = 2 | X1 = 1, X0 = 3) . P(X1 = 1, X0 = 3)
= P23(1) . P(X2 = 2 | X1 = 1) . P(X1 = 1, X0 = 3) [Using Markovian Property]
= P23(1) . P12(1) . P(X1 = 1 | X0 = 3) . P(X0 = 3)
= P23(1) . P12(1) . P31(1) . P(X0 = 3)
= 0.3 × 0.3 × 0.4 × 0.2
= 0.0072
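The joint probability in part (ii) needs only three entries of the transition matrix, so it can be checked without the full matrix (which is not reproduced here). A sketch of the factorisation, using the values from the solution:

```python
# By the Markovian property the joint probability factors as
# P(X0 = 3) * P31(1) * P12(1) * P23(1).
p31, p12, p23 = 0.4, 0.3, 0.3   # one-step transition probabilities from the solution
p0_state3 = 0.2                 # initial probability P(X0 = 3)

joint = p0_state3 * p31 * p12 * p23
print(round(joint, 4))  # 0.0072
```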
Related Posts:
- Stochastic Process- Definition, Specifications, and Classification
- Stationary Distribution | Steady State Distribution with Solved Problems
- n-Step Transition Probability Matrix of a Two-state Markov Chain with Solved problems
- Classification Of States and Periodicity of Markov Chain with Solved Problems
- Birth Death Process
- Chapman-Kolmogorov Equation
Markov Chain: Definition and Representation with Solved Problems, reviewed by Sandesh Shrestha on 24 June.