The channel capacity theorem is also known as Shannon's theorem.
The theorem states that, for error-free communication, the channel capacity is
$C = B \log_2 \left(1 + \frac{S}{N}\right) \ \text{b/s}$
Shannon's theorem is concerned with the rate of information transmission over a communication channel. The term communication channel covers all the features and components of the transmission system which introduce noise or limit the bandwidth.
According to Shannon's theorem, it is possible in principle to devise a means whereby a communication channel will transmit information with an arbitrarily small probability of error, provided the information rate is less than or equal to a rate C called the channel capacity.
The statement of Shannon's theorem is as follows:
Given a source of M equally likely messages, with M >> 1, generating information at a rate R, then if R > C, the probability of error is close to unity for every possible set of M transmitted signals.
This means that if the information rate R exceeds the specified value C, the error probability approaches unity as M increases.
Hence the channel capacity C is a very important characteristic of a communication channel.
Shannon introduced a formula to determine the theoretical highest data rate for a channel:
$C = B \log_2 (1 + \mathrm{SNR})$
Where B $\rightarrow$ bandwidth of the channel,
SNR $\rightarrow$ signal-to-noise ratio (as a linear power ratio, not in dB),
C $\rightarrow$ Shannon capacity of the channel in bps.
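
As an illustration, here is a minimal Python sketch that evaluates the formula. The function name `shannon_capacity` and the example figures (a voice-grade telephone channel with B = 3000 Hz and an SNR of 30 dB) are assumptions chosen for illustration, not values from the original answer:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon channel capacity C = B * log2(1 + S/N) in bits per second.

    snr_db is converted from decibels to a linear power ratio first,
    since the formula uses the linear ratio S/N.
    """
    snr_linear = 10 ** (snr_db / 10)   # dB -> linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative (assumed) values: B = 3000 Hz, SNR = 30 dB (ratio of 1000)
if __name__ == "__main__":
    c = shannon_capacity(3000, 30)
    print(f"C = {c:.0f} b/s")          # approximately 29,902 b/s
```

The example shows how either widening the bandwidth B or improving the SNR raises the theoretical maximum data rate C.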