
A Mathematical Theory of Communication




  1. Paper Review: A Mathematical Theory of Communication, by C.E. Shannon. Jin Woo Shin, Sang Joon Kim

  2. Contents • Introduction • Summary of Paper • Discussion

  3. Introduction • This paper founded information theory. • Before this paper, it was widely believed that the only way to make the error probability smaller was to reduce the data rate. • This paper revealed that any data rate below the channel capacity is achievable with arbitrarily small error probability. (Photo: C.E. Shannon)

  4. Summary of Paper • Preliminary • Discrete Source & Discrete Channel • Discrete Source & Cont. Channel • Cont. Source & Cont. Channel

  5. [Summary of Paper] Preliminary • Entropy • Ergodic source: irreducible, aperiodic property • Capacity
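
  The slide lists these quantities without their formulas. For reference, a LaTeX sketch of the standard definitions from Shannon's paper (base-2 logarithms give bits):

      % Entropy of a discrete source with symbol probabilities p_1, ..., p_n
      H = -\sum_{i=1}^{n} p_i \log_2 p_i \qquad \text{[bits per symbol]}

      % Capacity of a discrete noiseless channel, with N(T) the number of
      % allowed signals of duration T
      C = \lim_{T \to \infty} \frac{\log_2 N(T)}{T} \qquad \text{[bits per second]}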

  6. [Summary of Paper] Disc. Source & Disc. Channel • Capacity Theorem (Theorem 11, page 22), the most important result of this paper: If the entropy H of the discrete source is less than or equal to the channel capacity C, then there exists a coding system such that the source can be transmitted over the channel with an arbitrarily small frequency of errors. If H > C, there is no method of encoding which gives an equivocation less than H - C.
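
  As a concrete check of this condition (an added illustration, not part of the original slides), here is a minimal Python sketch that computes the capacity of a binary symmetric channel, C = 1 - H_b(p), and compares it with the entropy of a discrete source; the crossover probability and source distribution are made-up example values, and one source symbol per channel use is assumed.

      import math

      def binary_entropy(p):
          """Binary entropy H_b(p) in bits; the endpoints give 0."""
          if p in (0.0, 1.0):
              return 0.0
          return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

      def source_entropy(probs):
          """Entropy in bits/symbol of a discrete memoryless source."""
          return -sum(q * math.log2(q) for q in probs if q > 0)

      crossover = 0.1                     # example BSC crossover probability
      source = [0.5, 0.25, 0.125, 0.125]  # example source distribution

      H = source_entropy(source)          # bits per source symbol
      C = 1 - binary_entropy(crossover)   # BSC capacity, bits per channel use

      # Theorem 11: reliable transmission is possible iff H <= C
      print(f"H = {H:.3f} bits/symbol, C = {C:.3f} bits/use")
      if H <= C:
          print("A coding scheme with arbitrarily small error rate exists.")
      else:
          print(f"No scheme achieves equivocation below H - C = {H - C:.3f} bits.")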

  7. [Summary of Paper] Disc. Source & Cont. Channel • The domains of the channel input and output become infinite (continuous alphabets). • The capacity of a continuous channel is given by the expression sketched below. • The transmission rate cannot exceed the channel capacity.
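
  The capacity expression itself did not survive the transcript. In Shannon's paper the continuous-channel capacity is the maximum rate of transmission over input distributions, and for a band-limited channel with white Gaussian noise it reduces to the familiar closed form; a LaTeX sketch of both (the slide most likely showed one of these):

      % General definition: mutual information rate maximized over P(x)
      C = \lim_{T \to \infty} \max_{P(x)} \frac{1}{T}
          \iint P(x, y) \log \frac{P(x, y)}{P(x)\,P(y)} \, dx \, dy

      % Band W, average signal power P, white Gaussian noise power N
      C = W \log_2 \frac{P + N}{N} = W \log_2\!\left(1 + \frac{P}{N}\right)
          \qquad \text{[bits per second]}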

  8. [Summary of Paper] Cont. Source & Cont. Channel • A continuous source needs an infinite number of binary digits for exact specification. • Fidelity: a measure of how much distortion we allow. • The rate of a continuous source P(x) under fidelity constraint D is defined below. • For a given fidelity constraint D, transmission within that fidelity is possible whenever this rate does not exceed the channel capacity C.
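
  The defining formula was lost in the transcript; in modern notation the rate of a source under a fidelity (distortion) constraint is the following minimization of mutual information, which matches the slide's "subject to" fragment (a sketch, with d(x, y) denoting the distortion measure):

      R(D) = \min_{P(y \mid x)} I(X; Y)
             \quad \text{subject to} \quad \mathbb{E}\big[\, d(X, Y) \,\big] \le D

      I(X; Y) = \iint P(x, y) \log \frac{P(x, y)}{P(x)\,P(y)} \, dx \, dy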

  9. Discussion • Ergodic source • Practical approach • Rate distortion

  10. [Discussion] Ergodic source • The ergodic-source assumption is the essential one in the paper. • If the source is ergodic, the AEP holds, which yields the capacity theorem. • Finding a source that is not ergodic but still satisfies the AEP would be meaningful work. • One example:
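
  To make the "ergodic -> AEP" step concrete (an illustration added here, not from the slides), the Python sketch below draws long blocks from an i.i.d. Bernoulli source, the simplest ergodic source, and checks that -(1/n) log2 p(x^n) concentrates around the entropy H; the source parameter, block lengths, and sample count are arbitrary example values.

      import math
      import random

      def empirical_rate(p_one, n, rng):
          """Draw one length-n i.i.d. Bernoulli(p_one) block; return -(1/n) log2 p(block)."""
          log_p = 0.0
          for _ in range(n):
              bit = 1 if rng.random() < p_one else 0
              log_p += math.log2(p_one if bit == 1 else 1.0 - p_one)
          return -log_p / n

      p_one = 0.2   # example source parameter
      H = -p_one * math.log2(p_one) - (1 - p_one) * math.log2(1 - p_one)
      rng = random.Random(0)

      # AEP: as n grows, -(1/n) log2 p(X^n) -> H, so almost every block is typical
      for n in (10, 100, 1000, 10000):
          rates = [empirical_rate(p_one, n, rng) for _ in range(200)]
          worst = max(abs(r - H) for r in rates)
          print(f"n = {n:5d}   H = {H:.4f}   max |rate - H| over 200 blocks = {worst:.4f}")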

  11. [Discussion] Practical approach - 1 • This paper provides the upper bound on the achievable data rate. • Finding a good encoding scheme that approaches it is a separate problem. • Turbo codes and LDPC codes are among the most efficient codes known. • Block size, rate, BER, and decoding complexity are important factors when choosing a code for a specific system.

  12. [Discussion] Practical approach - 2 • Figure: SNR vs. BER for rate-1/2 codes, comparing uncoded transmission, a convolutional code with ML decoding, a turbo code, and an LDPC code against the Shannon bound (BER axis 10^0 down to 10^-4, SNR axis 0 to 6 dB, with a gap of about 4 dB annotated). C. Berrou and A. Glavieux, "Near Optimum Error Correcting Coding And Decoding: Turbo-Codes," IEEE Trans. Comms., Vol. 44, No. 10, Oct. 1996. ** The graph and chart are modified from the presentation data of Engling Yeo, Jan. 15, 2003.
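
  As a rough numerical companion to that figure (added here, not from the original slides), the Python sketch below evaluates the uncoded BPSK bit-error rate over an AWGN channel and a commonly used form of the Shannon limit for a rate-R code, Eb/N0 >= (2^(2R) - 1)/(2R); the rate and SNR grid are example choices.

      import math

      def q_function(x):
          """Gaussian tail probability Q(x), via the complementary error function."""
          return 0.5 * math.erfc(x / math.sqrt(2.0))

      R = 0.5  # example code rate

      # Shannon limit over the AWGN channel (Gaussian inputs): C = R gives
      # Eb/N0 >= (2^(2R) - 1) / (2R), which is 0 dB for R = 1/2.
      ebn0_limit = (2 ** (2 * R) - 1) / (2 * R)
      print(f"Shannon limit for rate {R}: Eb/N0 >= {10 * math.log10(ebn0_limit):.2f} dB")

      # Uncoded BPSK over AWGN: BER = Q(sqrt(2 Eb/N0))
      for ebn0_db in range(0, 11, 2):
          ebn0 = 10 ** (ebn0_db / 10)
          print(f"Eb/N0 = {ebn0_db:2d} dB   uncoded BPSK BER = {q_function(math.sqrt(2 * ebn0)):.2e}")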

  13. [Discussion] Rate distortion • The ‘Fidelity’ concept motivates ‘Rate Distortion’ theory. • The rate of a discrete source P(x) with distortion (fidelity) D is defined as R(D) = min I(X;Y), minimized over P(y|x) subject to E[d(X,Y)] <= D. • H (the entropy) is the rate with zero distortion. • (The Rate Distortion Theorem) We can compress a discrete source P(x) down to R(D) when a distortion of D is allowed.
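
  A standard worked example, not from the slides: for a Bernoulli(p) source with Hamming distortion, the rate-distortion function has a closed form, showing concretely how the required rate drops from the entropy H_b(p) at D = 0 toward zero as the allowed distortion grows:

      R(D) =
      \begin{cases}
        H_b(p) - H_b(D), & 0 \le D \le \min(p, 1 - p), \\
        0,               & D > \min(p, 1 - p),
      \end{cases}
      \qquad
      H_b(q) = -q \log_2 q - (1 - q) \log_2(1 - q)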
