Give the proper definition for entropy and information rate.

Entropy: The average information per message of a source m is called its entropy, denoted by H(m). Hence,

$H(m) = \sum_{i=1}^{n} p_i I_i$ bits

$\qquad \ \ \ = \sum_{i=1}^{n} p_i \log_2 \left(\frac{1}{p_i}\right)$ bits

$\qquad \ \ \ = -\sum_{i=1}^{n} p_i \log_2 p_i$ bits

Since the entropy of a source is a function of the message probabilities, it is interesting to find the message probability distribution that yields the maximum entropy. Because entropy is a measure of uncertainty, the probability distribution that generates the maximum uncertainty will have the maximum entropy.
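
For example, when all n messages are equiprobable ($p_i = \frac{1}{n}$ for every i), the entropy attains this maximum:

$H_{max} = \sum_{i=1}^{n} \frac{1}{n} \log_2 n = \log_2 n$ bits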

Here $p_i$ is the probability of occurrence of the i-th possible value of the source symbol. The equation gives the entropy in units of bits (per symbol) because it uses a logarithm of base 2.
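
As a quick numeric illustration (a minimal sketch in Python; the four message probabilities below are assumed example values, not part of the original answer), the entropy formula can be evaluated directly:

```python
import math

def entropy(probs):
    """Average information per message, H = -sum(p_i * log2(p_i)), in bits per message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example source with four messages of probabilities 1/2, 1/4, 1/8, 1/8 (assumed values)
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per message
```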

Information rate (R)

Information rate, R = r H

H = entropy or average information

r = rate at which messages are generated.

R = $\left(r \ \text{in} \ \frac{\text{messages}}{\text{second}}\right) \times \left(H \ \text{in} \ \frac{\text{information bits}}{\text{message}}\right)$

= Information bits/second

Thus, the information rate is expressed as the average number of bits of information per second.
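
Continuing the same assumed source, here is a short sketch of the rate calculation; the message rate of 1000 messages/second is an arbitrary example value, not from the original answer:

```python
import math

# Minimal sketch of R = r * H; the probabilities and message rate are assumed example values.
probs = [0.5, 0.25, 0.125, 0.125]
H = -sum(p * math.log2(p) for p in probs)   # entropy = 1.75 information bits/message
r = 1000                                    # message rate in messages/second (assumed)
R = r * H                                   # information rate
print(R)                                    # 1750.0 information bits/second
```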

$r = \lim_{n \rightarrow \infty} H(X_n \mid X_{n-1}, X_{n-2}, X_{n-3}, \ldots)$

For memoryless sources, this is merely the entropy of each symbol, while in the case of a stationary stochastic process it is

the conditional entropy of a symbol given all the previous symbols generated. For the more general case of a process that is not necessarily stationary, the average rate is

$r = \lim_{n \rightarrow \infty} \frac{1}{n} H(X_1, X_2, \ldots, X_n)$
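
For a concrete illustration of the conditional-entropy form of the rate, here is a minimal Python sketch for a stationary two-state Markov source; the transition matrix is an assumed example, and for a first-order Markov chain $H(X_n \mid X_{n-1}, X_{n-2}, \ldots)$ reduces to $H(X_n \mid X_{n-1})$:

```python
import math

# Sketch: entropy rate of a two-state stationary Markov source, illustrating the
# conditional-entropy form of the rate. Transition probabilities are assumed example values.
P = [[0.9, 0.1],   # P[i][j] = Pr(next symbol = j | current symbol = i)
     [0.5, 0.5]]

# Stationary distribution pi satisfies pi = pi P; for a 2-state chain it has the
# closed form pi_0 = P[1][0] / (P[0][1] + P[1][0]).
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi = [pi0, 1 - pi0]

# r = sum_i pi_i * H(next symbol | current symbol = i), in bits per symbol
rate = sum(pi[i] * sum(-p * math.log2(p) for p in P[i] if p > 0) for i in range(2))
print(rate)  # ~0.557 bits per symbol
```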
