Justify: Entropy of an image is maximized by histogram equalization.
1 Answer

Histogram equalization yields a flat histogram in the continuous domain. As a result, the probability of occurrence of each grey level in the image is equal. When all grey levels are equally probable, the entropy is maximized. For example, for a 256 grey-level image with equal probability of occurrence for each grey level, the maximum entropy is computed below.

$$H = -\sum_{i=0}^{L-1} p_i \log_2 p_i \\ \hspace{2cm} = -\sum_{i=0}^{255} \bigg(\frac{1}{256}\bigg) \log_2 \bigg(\frac{1}{256}\bigg) \\ \hspace{0cm} = 8 \space \text{bits/pixel}$$
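
As a quick numerical check of the figure above (a minimal sketch in Python using NumPy; nothing here is specific to any image library):

```python
import numpy as np

L = 256                       # number of grey levels
p = np.full(L, 1.0 / L)       # uniform probability for every grey level

H = -np.sum(p * np.log2(p))   # Shannon entropy, H = -sum(p_i * log2(p_i))
print(H)                      # 8.0 bits/pixel, i.e. log2(256)
```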

This simply means that an equal-length (fixed-length) code can be used on an image that has a uniform probability density function, since no grey level occurs more often than any other.
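
To see why the flat histogram is the best case for entropy, and why a fixed-length code is then already optimal, the sketch below (assuming a 256-level image; the peaked distribution is just an arbitrary illustrative example) compares a uniform grey-level distribution with a strongly peaked one:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of a probability distribution, in bits per pixel."""
    p = p[p > 0]                  # skip empty bins so 0 * log2(0) counts as 0
    return -np.sum(p * np.log2(p))

L = 256

# Flat histogram: the ideal (continuous-domain) outcome of histogram equalization
uniform = np.full(L, 1.0 / L)

# Arbitrary peaked histogram: most pixels crowded around grey level 128
peaked = np.exp(-0.5 * ((np.arange(L) - 128) / 10.0) ** 2)
peaked /= peaked.sum()

print(entropy_bits(uniform))  # 8.0 bits/pixel: the maximum, so an 8-bit
                              # fixed-length code is already optimal
print(entropy_bits(peaked))   # well below 8 bits/pixel: a variable-length
                              # code could compress this image further
```

Since histogram equalization drives the histogram toward the uniform case, it drives the entropy toward this 8 bits/pixel upper bound.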
