
Claude Shannon – In Memoriam



Presentation Transcript


  1. Claude Shannon – In Memoriam Jean Walrand U.C. Berkeley www.eecs.berkeley.edu/~wlr

  2. Outline • Claude Shannon • Entropy • Source Coding • Channel Coding • Separation Theorem

  3. Claude Shannon • 4/30/1916 – 2/24/2001 • 1937: Boolean Algebra → Logical Circuits • 1948: A Mathematical Theory of Communication

  4. Entropy • How much information is required to convey the value of a random variable? • Key insight: The quantity of information is tied to the uncertainty about that value. • Example 1: One fair coin flip = 1 bit of information • Example 2: Two fair coin flips = 2 bits • Example 3: N equally likely values = log2(N) bits
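
A minimal Python sketch (the entropy helper below is illustrative, not part of the talk) confirms the three examples:

```python
# Shannon entropy of a discrete distribution: H = -sum_i p_i * log2(p_i).
from math import log2

def entropy(probs):
    """Entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Example 1: one fair coin flip -> 1 bit
print(entropy([0.5, 0.5]))             # 1.0

# Example 2: two fair coin flips -> 4 equally likely outcomes -> 2 bits
print(entropy([0.25] * 4))             # 2.0

# Example 3: N equally likely values -> log2(N) bits
N = 8
print(entropy([1 / N] * N), log2(N))   # 3.0 3.0
```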

  5. Source Coding • How do we encode the values so that they can be conveyed with the minimum number of bits? • Key idea: Look at a sequence of outcomes X(1), X(2), …, X(n) where each X(m) is in {1, 2, …, K} • For n large, there are only about 2^(nH) (roughly) equally likely typical sequences, where H is smaller than log2(K) • In fact, H = - Σi pi log2(pi), where pi = P(X(m) = i)
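
To illustrate why roughly 2^(nH) sequences account for essentially all of the probability, here is a short Python sketch (the biased-coin source and the block length n are assumptions chosen for illustration): for a long i.i.d. sequence x, the quantity -log2 P(x) / n concentrates near H.

```python
# Typical-sequence sketch: for a long i.i.d. source sequence x of length n,
# -log2 P(x) / n is close to H, i.e. P(x) is about 2^(-nH), so roughly
# 2^(nH) sequences carry essentially all of the probability.
import random
from math import log2

p = 0.9                                        # P(X = 1); P(X = 0) = 0.1
H = -(p * log2(p) + (1 - p) * log2(1 - p))     # about 0.469 bits/symbol

n = 10_000
random.seed(0)
x = [1 if random.random() < p else 0 for _ in range(n)]

log_prob = sum(log2(p) if xi == 1 else log2(1 - p) for xi in x)
print(f"H = {H:.3f} bits, empirical -log2 P(x)/n = {-log_prob / n:.3f}")
```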

  6. Source Coding (c'd) • Example: P(1) = p = 1 – P(0), so H = - p log2(p) – (1 – p) log2(1 – p) • [Plot: binary entropy H versus p, rising from 0 at p = 0 to a maximum of 1 bit at p = 0.5 and back to 0 at p = 1]
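
Since the plot itself is not reproduced in this transcript, the following sketch tabulates the binary entropy function (the sample values of p are chosen for illustration):

```python
# Binary entropy H(p) = -p log2(p) - (1 - p) log2(1 - p): 0 bits at p = 0
# and p = 1, and a maximum of 1 bit at p = 0.5, matching the slide's plot.
from math import log2

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
    print(f"p = {p:4.2f}  H(p) = {binary_entropy(p):.3f} bits")
```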

  7. Source Coding (c'd) • Thus, for large n: of the 2^n possible n-bit words, only about 2^(nH) are typical and (roughly) equally likely, so about nH bits suffice to encode the source. [Diagram: the set of 2^n n-bit words containing the subset of 2^(nH) equally likely n-bit words]

  8. Channel Capacity • Question: How fast can one transmit bits reliably through a noisy channel? • Naïve answer: No reliable transmission is possible. • Shannon's formulation: What rate is achievable, in the long run, if one wants the bit error rate to be arbitrarily small? • Shannon's answer: the Channel Capacity

  9. Channel Capacity (c'd) • Example: binary symmetric channel; a sent bit (0 or 1) is received correctly with probability 1 - p and flipped with probability p • Capacity: C = 1 - H(p) • [Plot: C versus p, equal to 1 bit at p = 0 and p = 1 and dropping to 0 at p = 0.5]
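
A few values of the binary symmetric channel capacity C(p) = 1 - H(p), evaluated in a short sketch (the chosen values of p are illustrative):

```python
# Capacity of the binary symmetric channel with crossover probability p:
# C(p) = 1 - H(p) bits per channel use, where H is the binary entropy.
from math import log2

def bsc_capacity(p):
    if p in (0.0, 1.0):
        return 1.0   # deterministic channel: 1 bit per use
    return 1.0 + p * log2(p) + (1 - p) * log2(1 - p)

for p in [0.0, 0.01, 0.1, 0.5]:
    print(f"p = {p:4.2f}  C = {bsc_capacity(p):.3f} bits per use")
```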

  10. Channel Capacity (c'd) • Justification: Choose about 2^(nC) n-bit codewords at random (fair coin flips). For each codeword sent, about 2^(nH) of the 2^n n-bit words are likely to be received, so roughly 2^n / 2^(nH) = 2^(n(1 – H)) = 2^(nC) codewords remain distinguishable. [Diagram: 2^n equally likely n-bit words; 2^(nH) equally likely n-bit words for one word sent]
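
The counting in this justification can be checked numerically; the sketch below uses an assumed crossover probability and block length to show the orders of magnitude:

```python
# Random-coding count: of the 2^n possible received n-bit words, about
# 2^(nH) are likely for each codeword sent, so roughly
# 2^n / 2^(nH) = 2^(n(1 - H)) = 2^(nC) codewords remain distinguishable.
from math import log2

p = 0.1                                        # assumed BSC crossover probability
H = -(p * log2(p) + (1 - p) * log2(1 - p))     # noise entropy, about 0.469 bits
C = 1 - H                                      # about 0.531 bits per use

n = 1000                                       # assumed block length
print(f"received n-bit words:        2^{n}")
print(f"likely words per codeword:   2^{n * H:.0f}")
print(f"distinguishable codewords:   2^{n * C:.0f}  (rate C = {C:.3f})")
```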

  11. Separation Theorem • Source with entropy H bits per symbol • Channel with capacity C bits per second (bps) • Can send C/H symbols per second • First code the source (n symbols => nH bits) • Then code for the channel (send the bits with suitable codewords) • Hence: Separate source and channel coding!
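
A small numerical instance of the separation statement (the H and C values below are assumptions chosen for illustration):

```python
# Separation in numbers: a source of entropy H bits/symbol over a channel
# of capacity C bits/second supports about C / H source symbols per second.
H = 0.469        # assumed source entropy, bits per symbol (e.g. a P(1) = 0.9 coin)
C = 1_000_000    # assumed channel capacity, bits per second

print(f"about {C / H:,.0f} source symbols per second")
```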
