Obtain the Huffman code for the word COMMITTEE.
1 Answer

The word COMMITTEE contains a total of 9 symbols.

$\text{Probability} = \frac{\text{Number of occurrences of the symbol in the message}}{\text{Total number of symbols in the message}}$

Probability of symbol C: p(C) = 1/9

Probability of symbol O: p(O) = 1/9

Probability of symbol M: p(M) = 2/9

Probability of symbol I: p(I) = 1/9

Probability of symbol T: p(T) = 2/9

Probability of symbol E: p(E) = 2/9

Step 1: Arrange the symbols in descending order of probability.

| Symbol | Probability |
| ------ | ----------- |
| M      | 2/9         |
| T      | 2/9         |
| E      | 2/9         |
| C      | 1/9         |
| O      | 1/9         |
| I      | 1/9         |
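The counts and probabilities above are easy to verify programmatically; a minimal sketch using `collections.Counter` with exact `Fraction` arithmetic:

```python
from collections import Counter
from fractions import Fraction

word = "COMMITTEE"
counts = Counter(word)  # occurrences of each symbol
probs = {s: Fraction(c, len(word)) for s, c in counts.items()}

# Descending order of probability, as in Step 1 (sort is stable among ties)
for symbol, p in sorted(probs.items(), key=lambda kv: -kv[1]):
    print(symbol, p)
```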

Step 2: Construction of Huffman tree

*(Huffman tree figure omitted.)* The tree is built by repeatedly merging the two least-probable nodes: O (1/9) and I (1/9) merge into a node of probability 2/9; that node merges with C (1/9) into 3/9; and so on, until a single root of probability 1 remains. Each codeword is then read off the branch labels on the path from the root to the symbol's leaf.

Step 3: Codewords from the Huffman tree

| Symbol | Probability | Codeword | Word length |
| ------ | ----------- | -------- | ----------- |
| M      | 2/9         | 01       | 2           |
| T      | 2/9         | 10       | 2           |
| E      | 2/9         | 11       | 2           |
| C      | 1/9         | 001      | 3           |
| O      | 1/9         | 0000     | 4           |
| I      | 1/9         | 0001     | 4           |
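The construction in Steps 2 and 3 can be sketched with `heapq` (a minimal sketch; `huffman_code` is a hypothetical helper, and tie-breaking among the equal probabilities may produce a different but equally optimal set of codewords than the table above — every Huffman code for this source attains the same average length of 23/9):

```python
import heapq
from collections import Counter

def huffman_code(probs):
    """Build a Huffman code for a {symbol: probability} map."""
    # Heap entries: (probability, tiebreak index, {symbol: partial codeword}).
    # The unique index keeps tuple comparison away from the dicts on ties.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    next_id = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}  # prepend branch bits
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next_id, merged))
        next_id += 1
    return heap[0][2]

word = "COMMITTEE"
probs = {s: c / len(word) for s, c in Counter(word).items()}
code = huffman_code(probs)
avg = sum(probs[s] * len(w) for s, w in code.items())
print(code, avg)  # average length is 23/9, about 2.5556 bits/symbol
```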

Finding Entropy:

$H(S) = -\sum_{k=1}^{n} p_k \log_2 p_k$

Since $\log_2 x = \frac{\log_{10} x}{\log_{10} 2}$ and $\log_{10} 2 \approx 0.3010$, this can be written as

$H(S) = -\frac{1}{0.3010} \sum_{k=1}^{n} p_k \log_{10} p_k = \frac{1}{0.3010} \sum_{k=1}^{n} p_k \log_{10} \frac{1}{p_k}$

Step 4: Determination of the average length $(\bar{L})$

The formula to calculate the average length is $\bar{L} = \sum_{k=1}^{n} p_k l_k$

Where,

$\bar{L}$ = average codeword length,

$p_k$ = probability of occurrence of the $k$-th symbol,

$l_k$ = length of the codeword for the $k$-th symbol.

$\bar{L} = \frac{2}{9}\times 2 + \frac{2}{9}\times 2 + \frac{2}{9}\times 2 + \frac{1}{9}\times 3 + \frac{1}{9}\times 4 + \frac{1}{9}\times 4 = \frac{23}{9}$

Solving this, we get the average length as

$\bar{L}$ = 2.5556 bits/symbol
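The weighted sum above can be evaluated exactly in a couple of lines (a sketch using `fractions.Fraction` to avoid rounding error):

```python
from fractions import Fraction

# probabilities and codeword lengths from the table, in order M, T, E, C, O, I
probs = [Fraction(2, 9)] * 3 + [Fraction(1, 9)] * 3
lengths = [2, 2, 2, 3, 4, 4]

L_bar = sum(p * l for p, l in zip(probs, lengths))
print(L_bar, float(L_bar))  # 23/9, about 2.5556 bits/symbol
```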

Step 5: Determination of the entropy $H(S)$

$H(S) = -\frac{1}{0.3010}\left[\frac{2}{9} \log_{10} \frac{2}{9} + \frac{2}{9} \log_{10} \frac{2}{9} + \frac{2}{9} \log_{10} \frac{2}{9} + \frac{1}{9} \log_{10} \frac{1}{9} + \frac{1}{9} \log_{10} \frac{1}{9} + \frac{1}{9} \log_{10} \frac{1}{9}\right]$

Simplifying, the entropy is obtained as $H(S)$ = 2.5033 bits/symbol.

Step 6: Determination of efficiency

$\eta = \frac{H(S)}{\bar{L}} = \frac{2.5033}{2.5556} = 0.9795 = 97.95 \%$
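Steps 4 through 6 can be reproduced directly with `math.log2`, which avoids the base-10 conversion factor altogether (a sketch; values shown rounded to four decimals):

```python
from math import log2

probs = [2/9, 2/9, 2/9, 1/9, 1/9, 1/9]  # M, T, E, C, O, I
lengths = [2, 2, 2, 3, 4, 4]            # codeword lengths from the table

L_bar = sum(p * l for p, l in zip(probs, lengths))  # average codeword length
H = -sum(p * log2(p) for p in probs)                # source entropy, bits/symbol
eta = H / L_bar                                     # coding efficiency

print(f"L = {L_bar:.4f}, H(S) = {H:.4f}, eta = {eta:.2%}")
```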
