**Mumbai University > Electronics and Telecommunication Engineering > Sem 5 > Random Signal Analysis**
Marks: 10M
Year: Dec 2015
- CDF (Cumulative Distribution Function), also called the Probability Distribution Function
**Definition:** If X is a real random variable, then the function $F: R \rightarrow R$ defined by
$F_X(x) = P(X \leq x)$, where $-\infty < x < \infty$,
is called the distribution function (d.f.) of the random variable X. It is also known as the Cumulative Distribution Function (c.d.f.). For a discrete random variable, $F_X(x) = \sum\limits_{i : x_i \leq x} p_i$, where $p_i \geq 0$ and $\sum\limits_{i} p_i = 1$.
Properties:

- $F_X(-\infty) = 0$
- $F_X(\infty) = 1$
- $0 \leq F_X(x) \leq 1$
- $F_X(x_1) \leq F_X(x_2)$ if $x_1 < x_2$, i.e. the c.d.f. is a monotonically increasing function.
- $P(x_1 < X \leq x_2) = F_X(x_2) - F_X(x_1)$
- $F_X(x^+) = F_X(x)$, i.e. the function is continuous from the right.
- $F_X'(x) = f_X(x)$, wherever the derivative exists.
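As a quick numerical illustration, the sketch below assumes a standard normal random variable and uses SciPy's `norm.cdf` to check the limiting values, monotonicity, and the interval-probability identity:

```python
import numpy as np
from scipy.stats import norm  # standard normal chosen only as an example

F = norm.cdf  # F_X(x) = P(X <= x) for X ~ N(0, 1)

# Limiting values: F_X(-inf) = 0, F_X(+inf) = 1
print(F(-1e6), F(1e6))                # ~0.0 and ~1.0

# Monotonicity: F_X(x1) <= F_X(x2) whenever x1 < x2
xs = np.linspace(-5, 5, 101)
print(np.all(np.diff(F(xs)) >= 0))    # True

# Interval probability: P(x1 < X <= x2) = F_X(x2) - F_X(x1)
x1, x2 = -1.0, 1.0
print(F(x2) - F(x1))                  # ~0.6827 for the standard normal
```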
If X is a discrete random variable, then the distribution function can be obtained by adding the probabilities successively. Suppose X takes values $x_1, x_2, \dots, x_n, \dots$ with probabilities $p(x_1), p(x_2), \dots, p(x_n), \dots$ such that $\sum\limits_{i} p(x_i) = 1$; then by the definition of the distribution function $F_X(x)$ we have
$F_X(x_i) = P(X \leq x_i)$
$F_X(x_i) = p(x_1) + p(x_2) + \cdots + p(x_i)$
So, for a discrete random variable X, $F_X(x)$ is a monotonically increasing, right-continuous step function.
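For example, if X is the value shown by a fair die, then $p(x_i) = 1/6$ for $x_i = 1, 2, \dots, 6$, so $F_X(x)$ jumps by $1/6$ at each integer from 1 to 6; e.g. $F_X(2.5) = p(1) + p(2) = 1/3$ and $F_X(6) = 1$.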
If X is a continuous random variable, then the c.d.f. is given as
$F_X(x) = \int\limits_{-\infty}^{x} f_X(t)\, dt$
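As a numerical illustration, the sketch below assumes an exponential density $f_X(x) = \lambda e^{-\lambda x}$ for $x \geq 0$ (with $\lambda = 2$) and approximates this integral with a simple rectangle rule:

```python
import numpy as np

lam = 2.0                          # assumed rate parameter of the exponential density
x = np.linspace(0.0, 10.0, 10001)
f = lam * np.exp(-lam * x)         # f_X(x) = lambda * exp(-lambda * x), x >= 0

# F_X(x) = integral of f_X from -infinity to x (the density is 0 for x < 0 here)
dx = x[1] - x[0]
F = np.cumsum(f) * dx              # crude rectangle-rule approximation of the integral

print(F[-1])                       # ~1.0 : total area under the density curve
print(np.interp(1.0, x, F))        # ~0.8647, i.e. F_X(1) = 1 - exp(-2)
```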
- PDF (Probability Density Function)
**Definition:** The probability density function $f_X(x)$ of a continuous random variable X is defined such that
$P(x - dx/2 \leq X \leq x + dx/2) = f_X(x)\, dx$
and satisfies the following conditions:

- $f_X(x)$ is integrable over the range $(-\infty, \infty)$
- $f_X(x) \geq 0$ for all $x$, $-\infty < x < \infty$
- $\int\limits_{-\infty}^{\infty} f_X(x)\, dx = 1$, i.e. the total area under the curve is one.
Properties:
*The pdf $f_X(x)$ of a continuous random variable X satisfies the following properties:*
- $0 \leq f_X(x)$ for all $x$. A random variable X may assume negative values from $-\infty$ to 0, but $f_X(x)$ cannot take negative values, as it defines a probability measure.
- $\int\limits_{-\infty}^{\infty} f_X(x)\, dx = 1$
- $P(x_1 < X \leq x_2) = \int\limits_{x_1}^{x_2} f_X(x)\, dx$
- $P(x_1 < X \leq x_2) = P(x_1 < X < x_2)$
- $P(X = a) = \int\limits_{a}^{a} f_X(x)\, dx = 0$
- $f_X(x) = \dfrac{d}{dx} F_X(x)$, if the derivative exists.

**PMF (Probability Mass Function)**

**Definition:** For a discrete random variable X taking at most a countably infinite number of values $x_1, x_2, \dots$, the probability mass function is defined as $P(X = x_i) = p_i$.
So, $F_X(x) = \sum\limits_{i=1}^{n} P(X = x_i)\, u(x - x_i)$, where $u(\cdot)$ is the unit step function given by
$u(x) = \begin{cases} 1, & x \geq 0 \\ 0, & x < 0 \end{cases}$
Hence the probability distribution function is defined on the entire real line and not just on the range of the random variable. Indeed, $P(X = x_i)$ represents the step size, or jump, at $x_i$ of the $F_X(x)$ curve, i.e.
$P(X = x_i) = F_X(x_i) - F_X(x_{i-1})$
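This jump relation is easy to check numerically. The sketch below assumes a hypothetical three-point PMF and builds $F_X(x)$ as a sum of unit steps:

```python
import numpy as np

# Hypothetical discrete random variable: its values and probability masses
xs = np.array([1.0, 2.0, 4.0])
ps = np.array([0.2, 0.5, 0.3])           # p_i >= 0 and sum(p_i) = 1

def cdf(x):
    """F_X(x) = sum over i of P(X = x_i) * u(x - x_i), u being the unit step."""
    return np.sum(ps * (x >= xs))

# F_X is a right-continuous step function; the jump at x_i equals P(X = x_i)
for xi, pi in zip(xs, ps):
    jump = cdf(xi) - cdf(xi - 1e-9)      # F_X(x_i) - F_X(x_i^-)
    print(xi, jump, pi)                  # the jump matches the mass at x_i

print(cdf(3.0))                          # F_X(3) = 0.2 + 0.5 = 0.7
```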