Mumbai University > Information Technology > Sem 4 > Information Theory & Coding
Marks: 4 Marks
Year: May 2016
The most fundamental concept of information theory is entropy. Entropy is defined as the average amount of information per message. The entropy of a discrete random variable X is defined by

$$H(X) = -\sum_{x \in X} p(x) \log p(x)$$

Entropy is always non-negative, $H(X) \ge 0$, and $H(X) = 0$ if X is deterministic.
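As a concrete illustration, here is a minimal Python sketch of this formula; the helper name `entropy` and the example distribution are assumptions made purely for illustration:

```python
import math

def entropy(probs, base=2):
    """H(X) = -sum p(x) * log p(x); base 2 gives the result in bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Assumed distribution over four symbols
p_x = [0.5, 0.25, 0.125, 0.125]
print(entropy(p_x))  # 1.75 bits
```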
Two related entropy measures are the joint entropy and the conditional entropy:
Joint Entropy:
Joint entropy is the entropy of a joint probability distribution, i.e., of a multi-valued random variable. If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), then the joint entropy of X and Y is
$$H(X, Y) = -\sum_{x \in X} \sum_{y \in Y} f(x, y) \log f(x, y)$$
The joint entropy represents the amount of information needed on average to specify the values of the two discrete random variables.
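For illustration, a minimal Python sketch of the joint entropy computation; the 2×2 table of f(x, y) values below is an assumed example:

```python
import math

def joint_entropy(joint, base=2):
    """H(X, Y) = -sum over (x, y) of f(x, y) * log f(x, y)."""
    return -sum(p * math.log(p, base)
                for row in joint for p in row if p > 0)

# Assumed joint distribution f(x, y): rows index x, columns index y
f_xy = [[0.25, 0.25],
        [0.25, 0.25]]
print(joint_entropy(f_xy))  # 2.0 bits (uniform, independent X and Y)
```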
Conditional Entropy:
The average conditional self-information is called the conditional entropy. If X and Y are discrete random variables and f(x, y) and f(y | x) are the values of their joint and conditional probability distributions, then:
$$H(Y \mid X) = -\sum_{x \in X} \sum_{y \in Y} f(x, y) \log f(y \mid x)$$

is the conditional entropy of Y given X.
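A minimal Python sketch of this definition, which recovers f(y | x) from the joint table as f(x, y) / f(x); the joint values below are assumed for illustration:

```python
import math

def conditional_entropy(joint, base=2):
    """H(Y|X) = -sum f(x, y) * log f(y|x), where f(y|x) = f(x, y) / f(x)."""
    h = 0.0
    for row in joint:          # each row holds f(x, y) for one value of x
        f_x = sum(row)         # marginal probability f(x)
        for f_xy in row:
            if f_xy > 0:
                h -= f_xy * math.log(f_xy / f_x, base)
    return h

# Assumed joint distribution f(x, y): rows index x, columns index y
f_xy = [[0.5, 0.25],
        [0.0, 0.25]]
print(conditional_entropy(f_xy))  # ≈ 0.689 bits
```

As a sanity check, this value equals H(X, Y) − H(X) for the same table, consistent with the chain rule H(X, Y) = H(X) + H(Y | X).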