Entropy and information rate.

1] Entropy: Entropy is defined as the average information per message. It is denoted by H and its unit is bits/message.

To ensure maximum transfer of information, the entropy of the source should be as high as possible.

$\text{Entropy, } H = \sum_{k=1}^{m} p_k \, \log_2 \left( \frac{1}{p_k} \right)$
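
As a quick illustration of this formula, here is a minimal Python sketch, assuming a discrete source with a known probability for each message (the probability values and the function name `entropy` are illustrative assumptions, not part of the answer above):

```python
import math

def entropy(probs):
    # H = sum over k of p_k * log2(1/p_k), in bits/message.
    # Messages with p_k = 0 contribute nothing, since p*log2(1/p) -> 0 as p -> 0.
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Example: four messages with probabilities 1/2, 1/4, 1/8, 1/8
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/message
```

Note that H is maximized when all m messages are equally likely, giving $H = \log_2 m$ (here $\log_2 4 = 2$ bits/message), which is why a source whose messages are closer to equiprobable transfers more information per message.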

2] Information Rate (R):

If the message source generates "r" messages per second, then the information rate is given as

$R = r \times H$

where r $\rightarrow$ Number of messages/sec

H $\rightarrow$ Average information/message

$R = \left[ r \ \frac{\text{messages}}{\text{second}} \right] \times \left[ H \ \frac{\text{information}}{\text{message}} \right]$

R $\rightarrow$ Average information per second, expressed in bits/sec
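
Continuing the sketch above, the rate calculation is a single multiplication; the numbers below (a hypothetical source emitting 2000 messages/sec with the H computed earlier) are illustrative assumptions:

```python
def information_rate(r, H):
    # R = r * H: (messages/sec) x (bits/message) = bits/sec.
    return r * H

# Example: 2000 messages/sec, H = 1.75 bits/message
print(information_rate(2000, 1.75))  # 3500.0 bits/sec
```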
