Apr 5, 2020

Information theory

If each bit of your message has a chance $p$ of getting flipped by the time it’s received at the other end of the line, is there any hope of transmitting information in a way that the probability of receiving the true message can be made to approach 1?

- Reliable communication: communication with a probability of error that can be made arbitrarily small.

One way is to add redundancy:

You can repeat yourself (add redundancy) to make the transmission more reliable, but at the same time the rate of transmission decreases. For example, if you send each bit 3 times, the rate drops to 1/3 bit per channel use. If adding more and more redundancy increases reliability, one might think infinite redundancy is needed to achieve true reliability. In that case, is there any hope for reliable communication at all?! Can we increase reliability without forcing the rate to 0?

According to Shannon, for each channel there is in fact a range of rates achievable with arbitrary reliability – as high a reliability as you want, as close to probability 1 as you want. The maximum of such rates is the channel capacity.
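As a minimal sketch of the repetition idea above, here is a small simulation (the channel model, flip probability, and trial count are my own choices, not from the post): a binary symmetric channel that flips each bit with probability $p$, and the 3x repetition code with majority-vote decoding. The per-bit error rate should drop from $p$ to $3p^2(1-p) + p^3$, which at $p = 0.1$ is about 0.028 – better, but paid for with a rate of 1/3.

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode_repeat(bits, n=3):
    """Repetition code: send each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_majority(received, n=3):
    """Decode each group of n received bits by majority vote."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

rng = random.Random(0)
p = 0.1
trials = 100_000
message = [rng.randint(0, 1) for _ in range(trials)]

# Uncoded transmission: per-bit error rate should be close to p.
raw_errors = sum(a != b for a, b in zip(message, bsc(message, p, rng))) / trials

# 3x repetition + majority vote: error rate should be close to 3p^2(1-p) + p^3.
decoded = decode_majority(bsc(encode_repeat(message), p, rng))
rep_errors = sum(a != b for a, b in zip(message, decoded)) / trials

print(f"uncoded error rate:    {raw_errors:.4f}")   # near 0.1
print(f"repetition error rate: {rep_errors:.4f}")   # near 0.028
```

For comparison, the capacity of this channel is $C = 1 - H(p)$, where $H$ is the binary entropy function; at $p = 0.1$ that is about 0.53 bits per channel use – far above the repetition code's rate of 1/3, which is Shannon's point: reliability does not have to cost you the rate.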

Now I understand how Shannon's theory of communication is counter-intuitive: you'd think it's impossible to transmit any message reliably, but Shannon shows that you can pick any reliability (as close to 1 as you want) and still transmit information at a whole range of rates. This paradox reminds me of Zeno's paradox, in which you'd think it's impossible to cross the street, but it turns out you actually can.
