Explain Chebyshev's Inequality with suitable example.

**Mumbai University > Electronics and Telecommunication Engineering > Sem 5 > Random Signal Analysis**

Marks: 10M

Year: Dec 2015


Chebyshev’s inequality, also called the Bienaymé–Chebyshev inequality, is a theorem in probability theory that characterizes the dispersion of data away from its mean (average). Chebyshev’s inequality puts an upper bound on the probability that an observation lies far from its mean. It requires only two minimal conditions:

(1) that the underlying distribution have a mean and

(2) that the average size of the deviations away from this mean (as gauged by the standard deviation) is not infinite. Chebyshev’s inequality then states that the probability that an observation will be more than k standard deviations from the mean is at most $1/k^2$. Chebyshev used the inequality to prove his version of the law of large numbers.

Definition:

For a random variable X with mean value μ, variance $σ^2$ and standard deviation σ, we have

$P\{|X-μ| ≥ kσ\} ≤ 1/k^2$ for any k > 0

Proof:

Let X be a continuous random variable.

By definition, $σ^2 = E[X-E(X)]^2 = E[(X-μ)^2]$

$= ∫_{-∞}^{∞} (x-μ)^2 f_X(x)\,dx$

where $f_X(x)$ is the probability density function of X.


Splitting the range of integration at μ−kσ and μ+kσ,

$σ^2 = ∫_{-∞}^{μ-kσ} (x-μ)^2 f_X(x)\,dx + ∫_{μ-kσ}^{μ+kσ} (x-μ)^2 f_X(x)\,dx + ∫_{μ+kσ}^{∞} (x-μ)^2 f_X(x)\,dx$

Since the middle integral is non-negative, dropping it can only decrease the right-hand side:

∴ $σ^2 ≥ ∫_{-∞}^{μ-kσ} (x-μ)^2 f_X(x)\,dx + ∫_{μ+kσ}^{∞} (x-μ)^2 f_X(x)\,dx$

For the first integral, x ≤ μ−kσ, i.e. μ−x ≥ kσ, and

for the second integral, μ+kσ ≤ x, i.e. x−μ ≥ kσ.

In either case $(x-μ)^2 ≥ (kσ)^2$, so

∴ $σ^2 ≥ ∫_{-∞}^{μ-kσ} (kσ)^2 f_X(x)\,dx + ∫_{μ+kσ}^{∞} (kσ)^2 f_X(x)\,dx$

∴ $σ^2 ≥ (kσ)^2 \left( ∫_{-∞}^{μ-kσ} f_X(x)\,dx + ∫_{μ+kσ}^{∞} f_X(x)\,dx \right)$

But $∫_{α}^{β} f_X(x)\,dx = P(α ≤ X ≤ β)$

∴ $σ^2 ≥ (kσ)^2 \left( P(-∞ < X ≤ μ-kσ) + P(μ+kσ ≤ X < ∞) \right)$

∴ $σ^2 ≥ (kσ)^2 \left( P(X-μ ≤ -kσ) + P(X-μ ≥ kσ) \right)$

∴ $σ^2 ≥ (kσ)^2 \, P(|X-μ| ≥ kσ)$

∴ $P\{|X-μ| ≥ kσ\} ≤ 1/k^2$ for any k > 0

Or

$P\{|X-μ| < kσ\} ≥ 1 - 1/k^2$ for any k > 0
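The bound just derived can also be checked empirically. A minimal Python sketch (the exponential distribution, seed, and sample size here are illustrative choices, not part of the original answer):

```python
import random

# Empirically check P(|X - mu| >= k*sigma) <= 1/k^2 for an
# exponential distribution, where mean = std = 1/lam.
random.seed(0)
lam = 0.5
mu = 1 / lam        # mean of Exp(lam)
sigma = 1 / lam     # standard deviation of Exp(lam)

samples = [random.expovariate(lam) for _ in range(100_000)]

for k in (1.5, 2, 3):
    # fraction of samples at least k standard deviations from the mean
    tail_freq = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    print(f"k = {k}: observed tail {tail_freq:.4f} <= bound {1 / k**2:.4f}")
```

Since Chebyshev's inequality assumes nothing about the distribution beyond its mean and variance, the bound is typically loose: for this distribution the observed tail frequencies fall well below $1/k^2$.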

The Markov and Chebyshev inequalities are important because they bound probabilities directly when only the mean, or the mean and the variance, of the probability distribution are known.
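To illustrate the difference, the two bounds can be compared on a one-sided tail. The numbers below (μ = 20, σ = 2, threshold 30) echo the example that follows; the comparison itself is a sketch, not part of the original answer:

```python
# One-sided tail bounds for a nonnegative random variable with
# mean mu = 20 and standard deviation sigma = 2.
mu, sigma, a = 20.0, 2.0, 30.0

# Markov: P(X >= a) <= mu / a  (uses only the mean; requires X >= 0)
markov_bound = mu / a

# Chebyshev: P(X >= a) <= P(|X - mu| >= a - mu) <= sigma^2 / (a - mu)^2
chebyshev_bound = sigma**2 / (a - mu) ** 2

print(f"Markov bound:    {markov_bound:.4f}")    # 0.6667
print(f"Chebyshev bound: {chebyshev_bound:.4f}")  # 0.0400
```

Knowing the variance in addition to the mean tightens the bound considerably here.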

Example: Let X be a random variable denoting the number of complaints received at a service station on a day, with mean 20 and standard deviation 2. The probability that on a given day the number of complaints lies between 8 and 32 can be bounded using Chebyshev's inequality as shown below:

Here μ=20 and σ=2

When x = 32, k = |x−μ|/σ = |32−20|/2 = 6

When x = 8, k = |x−μ|/σ = |8−20|/2 = 6

Using

$P\{|X-μ| < kσ\} ≥ 1 - 1/k^2$ for any k > 0

$P\{|X-20| < 12\} ≥ 1 - 1/6^2$

∴ $P(8 < X < 32) ≥ 35/36$
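The arithmetic above can be reproduced in a short Python sketch (the variable names are illustrative):

```python
# Chebyshev lower bound on P(lo < X < hi) for the complaints example.
mu, sigma = 20, 2
lo, hi = 8, 32

# The interval (8, 32) is symmetric about mu, so a single k applies.
k = min(mu - lo, hi - mu) / sigma
bound = 1 - 1 / k**2

print(f"k = {k}, P({lo} < X < {hi}) >= {bound:.4f}")  # k = 6.0, bound = 0.9722
```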
