Mumbai University > Electronics and Telecommunication > Sem5 > Random Signal Analysis
Marks: 5M
Year: Dec 2014, May 2015
Markov Processes
The stochastic process $X(t)$ is referred to as a Markov Process if it satisfies the Markovian Property (i.e. the memoryless property). This states that

$$P\{X_{t_{n+1}} = x_{n+1} \mid X_{t_n} = x_n, \ldots, X_{t_1} = x_1\} = P\{X_{t_{n+1}} = x_{n+1} \mid X_{t_n} = x_n\}$$

for any choice of time instants $t_i,\ i = 1, \ldots, n$, where $t_j > t_k$ for $j > k$.

This is referred to as the Memoryless property: the state of the system at the future time $t_{n+1}$ is decided only by the system state at the current time $t_n$ and does not depend on the states at the earlier time instants $t_1, \ldots, t_{n-1}$.
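The memoryless property above can be made concrete in code: when simulating a Markov process, the next state is sampled using only the current state, never the earlier history. A minimal Python sketch (the 3-state chain and its probabilities below are a hypothetical example, not from the question):

```python
import random

# Hypothetical 3-state chain: for each current state, a list of
# (next_state, probability) pairs. Each row's probabilities sum to 1.
P = {
    0: [(0, 0.5), (1, 0.3), (2, 0.2)],
    1: [(0, 0.1), (1, 0.6), (2, 0.3)],
    2: [(0, 0.4), (1, 0.4), (2, 0.2)],
}

def next_state(current, rng=random):
    """Sample X_{t_{n+1}} given ONLY X_{t_n} = current (Markov property)."""
    r, cum = rng.random(), 0.0
    for state, prob in P[current]:
        cum += prob
        if r < cum:
            return state
    return P[current][-1][0]  # guard against floating-point round-off

def sample_path(start, steps):
    """Generate a realization of the chain of length steps+1."""
    path = [start]
    for _ in range(steps):
        # Note: only path[-1] is consulted; path[:-1] is ignored.
        path.append(next_state(path[-1]))
    return path
```

The key point is structural: `next_state` receives a single state, so the sampled future cannot depend on anything but the present, which is exactly what the defining equation says.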
Restricted versions of the Markov Property lead to different types of Markov Processes. These may be classified based on whether the state space is a continuous variable or discrete and whether the process is observed over continuous time or only at discrete time instants.
These may be summarized as -
(a) Markov Chains over a Discrete State Space
(b) Discrete Time and Continuous Time Markov Processes and Markov Chains
A Homogeneous Markov Chain is one where the transition probability $P\{X_{n+1} = j \mid X_n = i\}$ is the same regardless of $n$.

Therefore, the transition probability $p_{ij}$ of going from state $i$ to state $j$ may be written as

$$p_{ij} = P\{X_{n+1} = j \mid X_n = i\} \quad \text{for all } n$$
Note that this property implies that if the system is in state i at the $n^{th}$ time instant then the probability of finding the system in state j at the $(n+1)^{th}$ time instant (i.e. the next time instant) does not depend on when we observe the system, i.e. the particular choice of n. This simplifies the description of the system considerably as the transition probability $p_{ij}$ will be the same, regardless of when we choose to observe the system.
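Because the one-step probabilities $p_{ij}$ do not change with $n$, a homogeneous chain is fully described by a single transition matrix $P$, and the $n$-step transition probabilities are simply the entries of $P^n$. A short Python sketch of this, using a hypothetical 2-state matrix (pure Python, no libraries assumed):

```python
# Hypothetical 2-state transition matrix: rows index the current state i,
# columns the next state j, entry P[i][j] = p_ij. Each row sums to 1.
P = [[0.9, 0.1],
     [0.2, 0.8]]

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """n-step transition matrix of a homogeneous chain: just P raised
    to the n-th power, since p_ij is the same at every time step."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result
```

Each row of `n_step(P, n)` remains a probability distribution (sums to 1), and the entry `[i][j]` gives the probability of being in state $j$ exactly $n$ steps after being in state $i$, no matter when those $n$ steps begin.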
Example: Communication System:
Consider a communication system in which the digits 0 and 1 are transmitted through several stages. At each stage the probability that the entering digit will be transmitted unchanged is $p$. Let $X_n$ denote the digit entering the $n^{th}$ stage; then $\{X_n,\ n = 0, 1, 2, \ldots\}$ is a Markov Chain with two states, 0 and 1.
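This two-state chain has the symmetric transition matrix $\begin{pmatrix} p & 1-p \\ 1-p & p \end{pmatrix}$, and for such a matrix the diagonal entry of $P^n$ works out to $\tfrac{1}{2}\left(1 + (2p-1)^n\right)$. A small Python sketch of the example (the function names and the default $p = 0.9$ are illustrative choices, not part of the question):

```python
import random

def transmit(digit, stages, p=0.9, rng=random.Random(0)):
    """Pass a digit (0 or 1) through the given number of relay stages.
    At each stage the digit is kept with probability p, flipped with
    probability 1 - p -- the two-state chain from the example."""
    for _ in range(stages):
        if rng.random() >= p:      # flip with probability 1 - p
            digit = 1 - digit
    return digit

def prob_unchanged(stages, p=0.9):
    """Exact probability that the output digit equals the input after
    n stages: the diagonal entry of P^n for this symmetric chain,
    (1 + (2p - 1)^n) / 2."""
    return 0.5 * (1 + (2 * p - 1) ** stages)
```

Note that as the number of stages grows, $(2p-1)^n \to 0$ for $0 < p < 1$, so the probability of correct transmission approaches $\tfrac{1}{2}$: after many noisy stages the output digit carries almost no information about the input.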