A random sequence (discrete-time random process) is a sequence of random variables $\{X_n\}$:
$X_1, X_2, \ldots, X_n, \ldots$
For a specific outcome $\omega$, the sequence of numbers $\{X_n(\omega)\}$ may or may not converge.
Types of convergence
Convergence almost everywhere (a.e.): We say that a random sequence $\{X_n\}$ converges almost everywhere to the random variable $X$ if $P\left(\lim_{n\to \infty} X_n = X\right) = 1$. We write $X_n\to X$ a.e.
That is, there is a set of outcomes $\zeta$ of probability 1 such that $\lim_{n\to \infty}X_n(\zeta)=X(\zeta)$; equivalently, $X_n(\zeta)\to X(\zeta)$ with probability 1.
Example: Consider an animal of some short-lived species. We note the exact amount of food that this animal consumes day by day. This sequence of numbers will be unpredictable in advance, but we may be quite certain that one day the number will become zero and will stay zero forever after. That is, the sequence of daily food amounts converges almost surely (almost everywhere) to the random variable that takes the value 0.
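For instance (a simple worked illustration): let $\zeta$ be uniformly distributed on $[0,1]$ and define $X_n(\zeta)=\zeta^n$. For every outcome $\zeta\in[0,1)$ we have $X_n(\zeta)\to 0$; the only exceptional outcome is $\zeta=1$, which has probability $0$. Hence $X_n\to 0$ almost everywhere.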
Convergence in Probability (p): We say that a random sequence $\{X_n\}$ converges to the random variable $X$ in probability if $\lim_{n\to \infty} P[|X_n-X|\gt \epsilon]=0$ for every $\epsilon \gt 0$.
Example: Suppose we conduct the following experiment. First, pick a random person in the street. Let $X$ be his/her height, which is a random variable. Then ask other people to estimate this height by eye, and let $X_n$ be the average of the first $n$ responses. Then (provided there is no systematic error), by the law of large numbers, the sequence $X_n$ converges in probability to the random variable $X$.
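A minimal simulation sketch of this example (assuming the eye-estimates are i.i.d. and unbiased; the true height is treated as a fixed constant for simplicity, and all names and numbers below are illustrative):

```python
import numpy as np

# Sketch of the height-estimation example: the running average X_n of
# i.i.d., unbiased eye-estimates should satisfy P(|X_n - X| > eps) -> 0.
rng = np.random.default_rng(0)

true_height = 170.0   # the height X (cm), fixed here for simplicity
eps = 1.0             # tolerance epsilon
n_trials = 2000       # independent repetitions of the whole experiment
max_n = 500           # maximum number of responses averaged

# Each row is one trial: max_n i.i.d. unbiased estimates of the height.
estimates = true_height + rng.normal(0.0, 10.0, size=(n_trials, max_n))
running_avg = np.cumsum(estimates, axis=1) / np.arange(1, max_n + 1)

for n in (10, 100, 500):
    p = np.mean(np.abs(running_avg[:, n - 1] - true_height) > eps)
    print(f"n = {n:3d}   estimated P(|X_n - X| > {eps}) = {p:.3f}")
# The estimated probability shrinks toward 0 as n grows, which is exactly
# what convergence in probability means.
```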
Convergence in Mean Square Sense (MS): We say that a random sequence $\{X_n\}$ converges to the random variable $X$ in mean square if
$E[|X_n-X|^2]\to 0$ as $n\to \infty$.
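For instance, if $X_1, X_2, \ldots$ are i.i.d. observations with mean $\mu$ and finite variance $\sigma^2$, and $\bar{X}_n$ denotes their sample mean, then $E[|\bar{X}_n-\mu|^2]=\dfrac{\sigma^2}{n}\to 0$ as $n\to \infty$, so $\bar{X}_n\to \mu$ in the mean square sense.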
Convergence in Distribution (d): Consider a random sequence $\{X_n\}$ of random variables with distribution functions $F_n(x)$. Suppose $F_X(x)$ denotes the distribution function of a random variable $X$. If, for every point of continuity of $F_X(x)$,
$F_n(x)\to F_X(x)$ as $n\to \infty$,
then we say that $X_n$ converges to $X$ in distribution.
Example: Suppose a new dice factory has just been built. The first few dice come out quite biased, due to imperfections in the production process, so the outcome from tossing any of them will follow a distribution markedly different from the desired uniform distribution. As the factory is improved, the dice become less and less loaded, and the outcome from tossing a newly produced die follows the uniform distribution more and more closely; this is exactly convergence in distribution.
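A minimal numerical sketch of this example (the specific bias model, with the excess probability on face 6 shrinking like $1/n$, is an assumption made purely for illustration):

```python
import numpy as np

# Sketch of the dice-factory example: the n-th die's bias fades as n grows,
# so its CDF F_n approaches the uniform (fair-die) CDF F_X -- convergence
# in distribution.
uniform_pmf = np.full(6, 1 / 6)
F_X = np.cumsum(uniform_pmf)          # fair-die CDF evaluated at x = 1,...,6

def biased_pmf(n):
    # Face 6 is over-weighted; the excess shrinks like 1/n and is taken
    # evenly from the other five faces (illustrative bias model).
    pmf = np.full(6, 1 / 6 - 1 / (10 * n))
    pmf[5] = 1 / 6 + 5 / (10 * n)
    return pmf

for n in (1, 10, 100, 1000):
    F_n = np.cumsum(biased_pmf(n))
    print(f"n = {n:4d}   max |F_n(x) - F_X(x)| = {np.max(np.abs(F_n - F_X)):.5f}")
# The largest gap between F_n and F_X goes to 0 as the factory improves.
```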
Comparison of Convergence Modes
a.e. $\Rightarrow$ p $\Rightarrow$ d
MS $\Rightarrow$ p $\Rightarrow$ d
a.e. does not imply MS, and MS does not imply a.e.
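For instance (a standard counterexample for the first gap): take $X_n$ with $P(X_n=n)=\frac{1}{n^2}$ and $P(X_n=0)=1-\frac{1}{n^2}$. Since $\sum_n \frac{1}{n^2}\lt \infty$, the Borel–Cantelli lemma gives $X_n\to 0$ almost everywhere, yet $E[|X_n-0|^2]=n^2\cdot\frac{1}{n^2}=1\not\to 0$, so $X_n$ does not converge to $0$ in mean square.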