Mumbai University > Electronics and Telecommunication > Sem5 > Random Signal Analysis
Marks: 4M
Year: Dec 2014
The distribution of a random variable is often characterized in terms of its moment generating function (mgf), a real function whose derivatives at zero are equal to the moments of the random variable.
Let X be a random variable. If the expected value $E(e^{θX})$ exists and is finite for all real numbers θ belonging to a closed interval [-h,h] ⊂ R with h > 0, then we say that X possesses a moment generating function, and the function
$$M_X (θ)=E(e^{θX} )$$
is called the moment generating function of X.
$$∴ M_X (θ)=E(e^{θX} )=\sum_x e^{θx}\,P(X=x)$$ if X is discrete
$$M_X (θ)=E(e^{θX} )=\int_{-∞}^{∞}e^{θx} f_X (x)\,dx$$ if X is continuous
for all real θ for which the sum or integral converges absolutely.
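As a brief worked illustration of the two cases (the Bernoulli and exponential distributions here are chosen only as examples; they are not part of the original question):

For a discrete X taking values 0 and 1 with P(X = 1) = p (Bernoulli),
$$M_X (θ)=\sum_x e^{θx}\,P(X=x)=(1-p)e^{0}+p\,e^{θ}=1-p+p\,e^{θ},$$
which exists and is finite for every real θ.

For a continuous X with exponential density $f_X (x)=λe^{-λx}$ for x ≥ 0,
$$M_X (θ)=\int_0^∞ e^{θx}\,λe^{-λx}\,dx=\frac{λ}{λ-θ}, \qquad θ<λ.$$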
Moments about the origin may be found by a power series expansion; thus we may write
$$M_X (θ)=E(e^{θX} )$$
$$=E\left(\sum_{r=0}^∞\frac{(θX)^{r}}{r!}\right)$$
$$=\sum_{r=0}^∞\frac{θ^{r}}{r!}\,E(X^r)$$
interchanging the expectation with the summation term by term,
i.e.
$$M_X (θ)=\sum_{r=0}^∞\frac{θ^r}{r!}\,μ_r' $$
where $μ_r'=E(X^r)$ is the $r^{th}$ raw moment of X. Differentiating this series r times and setting θ = 0 leaves only the r-th term, so $μ_r'=\dfrac{d^r}{dθ^r}M_X (θ)\Big|_{θ=0}$; this is precisely why the derivatives of the mgf at zero equal the moments of X.
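As a quick sanity check of this result, continue the exponential example above (again purely illustrative): expanding its mgf as a geometric series,
$$M_X (θ)=\frac{λ}{λ-θ}=\frac{1}{1-θ/λ}=\sum_{r=0}^∞\left(\frac{θ}{λ}\right)^{r}=\sum_{r=0}^∞\frac{θ^r}{r!}\cdot\frac{r!}{λ^r},$$
so matching coefficients gives $μ_r'=\dfrac{r!}{λ^r}$; in particular $E(X)=1/λ$ and $E(X^2)=2/λ^2$, which agree with the known moments of the exponential distribution.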