
# Markov Chains

Let us begin with an introduction to Markov chains. A Markov chain is a system that undergoes transitions from one state to another within a countable set of states. It is a random process in which the next state depends only on the current state, not on the sequence of events that preceded it. This memoryless behavior is called the Markov property, and the chain is named after the mathematician Andrey Markov.
The probability of moving from state j to state k in n time steps is
Pjk(n) = Pr(Yn = k | Y0 = j)
This n-step transition probability is the basic quantity describing movement from one state to another.
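As a minimal sketch, the n-step transition probabilities can be computed as powers of the one-step transition matrix; the two-state matrix below is an illustrative assumption, not taken from the text.

```python
import numpy as np

# Hypothetical two-state chain; the entries are illustrative only.
# Each row sums to 1, as required of a transition matrix.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def n_step(P, n):
    """Return the n-step transition matrix P^n, whose (j, k) entry
    is Pr(Y_n = k | Y_0 = j)."""
    return np.linalg.matrix_power(P, n)

P3 = n_step(P, 3)
print(P3[0, 1])  # probability of moving from state 0 to state 1 in 3 steps
```

Each power of P is again a valid transition matrix, so its rows still sum to 1.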
The single-step transition probability is the special case n = 1:
Pjk = Pr(Y1 = k | Y0 = j)
A time-homogeneous Markov chain satisfies
Pjk(n) = Pr(Ym+n = k | Ym = j)
for every m ≥ 0, so the n-step transition probabilities do not depend on the starting time m.
If the Markov chain is time-homogeneous (this means the process is described by a single, time-independent transition matrix P = [Pjk]), then the vector π is known as a stationary distribution if
πk = ∑j∈S πj Pjk
for every state k; in other words, π is left unchanged by one transition step.
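One way to approximate a stationary distribution is to iterate π(t+1) = π(t)P until the vector stops changing; the fixed point satisfies πk = ∑j πj Pjk. The transition matrix here is an illustrative assumption.

```python
import numpy as np

# Illustrative two-state transition matrix (not from the text).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Power iteration: repeatedly apply pi(t+1) = pi(t) P.
pi = np.array([1.0, 0.0])          # arbitrary starting distribution
for _ in range(1000):
    new = pi @ P
    if np.allclose(new, pi):       # converged: pi is (numerically) stationary
        break
    pi = new

print(pi)  # approximate stationary distribution
```

Since each step multiplies by a matrix whose rows sum to 1, π remains a probability distribution throughout the iteration.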
A state k is accessible from a state j if a system started in state j has a non-zero probability of transitioning into state k at some point. In short, k is accessible from j if there exists an integer n ≥ 0 such that
Pr(Yn = k | Y0 = j) = Pjk(n) > 0
Since n = 0 is allowed, every state is by definition accessible from itself.
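Accessibility can be checked by searching the directed graph whose edges are the non-zero one-step transition probabilities: k is accessible from j exactly when there is a directed path from j to k. This sketch uses a breadth-first search over an illustrative three-state chain.

```python
from collections import deque

def accessible(P, j, k):
    """Return True if state k is accessible from state j, i.e. some
    n >= 0 gives Pjk(n) > 0."""
    seen, queue = {j}, deque([j])      # n = 0: every state reaches itself
    while queue:
        s = queue.popleft()
        if s == k:
            return True
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                queue.append(t)
    return False

# Illustrative 3-state chain: state 2 is absorbing,
# so no other state is accessible from it.
P = [[0.5, 0.5, 0.0],
     [0.0, 0.5, 0.5],
     [0.0, 0.0, 1.0]]
print(accessible(P, 0, 2))  # True
print(accessible(P, 2, 0))  # False
```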
A discrete-time Markov chain may or may not be adequate for a given system, since some systems change state at arbitrary moments in continuous time. Some points about the continuous-time case are discussed below.
- Y(t), t ≥ 0 is a continuous-time Markov chain, i.e. a stochastic process taking values in a countable set (such as 1, 2, 3, …) with the Markov property
  P[Y(t + s) = k | Y(s) = j, Y(u) = y(u) for 0 ≤ u < s] = P[Y(t + s) = k | Y(s) = j]
- The Markov property for Y(t) implies the discrete-time Markov property for the sequence of states visited: Yn is an embedded Markov chain with transition matrix P = [Pjk].
- In short, a CTMC is a process that stays in state j for an exponentially distributed time Tj ~ Exp(λj), after which it moves to some other state k with probability Pjk.
- A continuous-time Markov process is the continuous-time version of the Markov property.
- Markov processes are used to describe physical processes in which a system evolves continuously in time. They can be applied to an ensemble of independent systems as well as to a single system, with the probabilities used to count the expected number of members in each state.
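The CTMC description above (exponential holding times followed by jumps of the embedded chain) can be sketched as a simulation; the rates λj and the jump matrix are illustrative assumptions, not values from the text.

```python
import random

# Illustrative two-state CTMC: holding rates lambda_j and the
# embedded (jump) chain P_jk are assumptions for this sketch.
rates = [1.0, 2.0]                     # lambda_j for states 0 and 1
jump = [[0.0, 1.0],                    # from state 0, always jump to 1
        [1.0, 0.0]]                    # from state 1, always jump to 0

def simulate(start, horizon, rng=random.Random(0)):
    """Simulate the CTMC until time `horizon`; return (time, state) pairs."""
    t, state, path = 0.0, start, [(0.0, start)]
    while True:
        t += rng.expovariate(rates[state])   # Exp(lambda_j) holding time
        if t >= horizon:
            return path
        # Jump according to the embedded chain's row for the current state.
        state = rng.choices(range(len(jump)), weights=jump[state])[0]
        path.append((t, state))

path = simulate(start=0, horizon=10.0)
print(len(path), "jumps recorded")
```

Discarding the holding times and keeping only the sequence of visited states recovers the embedded discrete-time chain described above.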

## Topics Covered in Markov Chains

The topic of our discussion is steady-state behavior, and as the name suggests, a system in steady state has properties that do not change with time. Defined mathematically: if a system is not changing any of its properties with respect to time, it is in steady state.

## Continuous Time Markov Chains

In probability theory, a continuous-time Markov chain is a random process X(t), t ≥ 0. It satisfies the Markov property and is the continuous-time counterpart of the discrete Markov chain. The Markov property states that the conditional distribution of the future, given the history up to a given time, depends only on the state of the process at that time; in other words, the future of the process is independent of its past given the present state.

## Classification of States of Markov Chains

In a Markov chain there are three classifications for each state. Each state falls into one and only one category, so these categories form a partition of the states. This categorization is made to find the communicating classes of the chain.
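Communicating classes, the building blocks of this partition, group states that are mutually accessible: j and k belong to the same class when each is reachable from the other. A sketch that finds the classes by intersecting forward-reachable sets is shown below; the four-state chain is an illustrative assumption.

```python
def reachable(P, j):
    """Return the set of states reachable from j (including j itself)."""
    seen, stack = {j}, [j]
    while stack:
        s = stack.pop()
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                stack.append(t)
    return seen

def communicating_classes(P):
    """Partition the states into communicating classes."""
    n = len(P)
    fwd = [reachable(P, j) for j in range(n)]
    classes, assigned = [], set()
    for j in range(n):
        if j in assigned:
            continue
        # j and k communicate iff each is reachable from the other.
        cls = {k for k in fwd[j] if j in fwd[k]}
        classes.append(sorted(cls))
        assigned |= cls
    return classes

# Illustrative 4-state chain: states 0 and 1 communicate; 2 and 3 do not
# communicate with anything else (2 can leave but never return).
P = [[0.5, 0.5, 0.0, 0.0],
     [0.5, 0.5, 0.0, 0.0],
     [0.0, 0.0, 0.5, 0.5],
     [0.0, 0.5, 0.0, 0.5]]
print(communicating_classes(P))  # [[0, 1], [2], [3]]
```

Because communication is an equivalence relation, every state lands in exactly one class, matching the partition described above.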
