
《 信息论与编码 》 《Information Theory and Coding》



  1. 《信息论与编码》《Information Theory and Coding》
  Dr. Li Chen, Associate Professor, School of Information Science and Technology (SIST)
  Office: 631C, SIST building
  Email: chenli55@mail.sysu.edu.cn
  Web: sist.sysu.edu.cn/~chenli

  2. 《Information Theory and Coding》 Textbooks:
  1. 《Elements of Information Theory》, by T. Cover and J. Thomas, Wiley (introduced in China by Tsinghua University Press), 2003.
  2. 《Non-Binary Error Control Coding for Wireless Communication and Data Storage》, by R. Carrasco and M. Johnston, Wiley, 2008.
  3. 《Error Control Coding》, by S. Lin and D. Costello, Prentice Hall, 2004.
  4. 《信息论与编码理论》 (Information Theory and Coding Theory), by Wang Yumin and Li Hui, Higher Education Press, 2013.

  3. Outline
  Chapter 1: Fundamentals of Information Theory (2 weeks)
  Chapter 2: Data Compression: Source Coding (2 weeks)
  Chapter 3: An Introduction to Channel Coding (2 weeks)
  Chapter 4: Convolutional Codes and Trellis Coded Modulation (4 weeks)
  Chapter 5: Turbo Codes (2 weeks)
  Chapter 6: Low-Density Parity-Check Codes (3 weeks)

  4. Evolution of Communications
  [Timeline figure] Analogue communications gave way to digital communications from the late 80s to the early 90s, driven by information theory and coding techniques (EE + CS).

  5. Chapter 1: Fundamentals of Information Theory
  • 1.1 An Introduction to Information
  • 1.2 Measure of Information
  • 1.3 Average Information (Entropy)
  • 1.4 Channel Capacity

  6. §1.1 An Introduction to Information
  • What is information? • How do we measure information? Let us look at the following sentences:
  1) I will be one year older next year. (Boring!) No information.
  2) I was born in the 1990s. (Interesting, so which year?) A bit of information.
  3) I was born in 1993. (Being frank!) More information.
  The number of possibilities should be linked to the information!

  7. §1.1 An Introduction to Information
  Let us play the following game:
  Throw a die once: 6 possible outcomes, {1, 2, 3, 4, 5, 6}.
  Throw three dice: 6^3 = 216 possible outcomes.
  Information should be 'additive', as the sketch below checks.
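A minimal numerical check of that additivity (the dice game is the slide's; this code is only an illustration):

```python
import math

# One die: 6 equally likely outcomes.
one_die = math.log2(6)             # ~2.585 bits

# Three independent dice: 6**3 = 216 equally likely outcomes.
three_dice = math.log2(6 ** 3)     # ~7.755 bits

# The logarithm turns multiplying possibilities into adding information:
print(three_dice, 3 * one_die)     # both ~7.755
```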

  8. §1.1 An Introduction to Information
  Let us look at the following problem: we can use the logarithm to scale down a huge number of possibilities. Patterns of binary digits (bits) are then used to represent all the possibilities, as illustrated below.
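A small illustration of the point: the number of binary digits needed to label N equally likely possibilities grows only logarithmically in N.

```python
import math

# Binary digits needed to give every possibility a distinct label:
for n in [2, 8, 1000, 2**20]:
    print(f"{n} possibilities -> {math.ceil(math.log2(n))} bits")
# Even 1,048,576 possibilities need only 20 bits.
```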

  9. §1.1 An Introduction to Information
  Finally, let us look into the following game.

  10. §1.1 An Introduction to Information
  The measure of information should take into account the probabilities of the various possible events.

  11. §1.2 Measure of Information

  12. §1.2 Measure of Information • Observations:

  13. §1.2 Measure of Information
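The equations on slides 11-13 are not preserved in this transcript. The standard measure this chapter builds on, and the one assumed in the sketch below, is the self-information I(x) = -log2 p(x): rarer events carry more information, and independent events add.

```python
import math

def self_information(p: float) -> float:
    """Self-information, in bits, of an event with probability p."""
    return -math.log2(p)

print(self_information(1.0))   # 0 bits: a certain event tells us nothing
print(self_information(0.5))   # 1 bit: a fair coin flip
print(self_information(1/6))   # ~2.585 bits: one outcome of a fair die

# Independent events: probabilities multiply, information adds.
print(self_information(0.5 * 1/6), self_information(0.5) + self_information(1/6))
```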

  14. §1.3 Average Information (Entropy, 熵)

  15. §1.3 Average Information (Entropy) Example 1.2:
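The statement of Example 1.2 is not preserved in the transcript. As a stand-in, here is a minimal sketch of the entropy H(X) = -Σ p(x) log2 p(x) on a made-up four-symbol source (not the textbook's example):

```python
import math

def entropy(probs):
    """Entropy H(X) in bits/symbol; p = 0 terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A made-up four-symbol source, NOT the textbook's Example 1.2:
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits/symbol
print(entropy([0.25] * 4))                  # 2.0 bits/symbol (uniform is maximal)
```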

  16. §1.3 Average Information (Entropy)
  The binary entropy function Hb(p)
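A standard fact assumed here: Hb(p) = -p log2 p - (1 - p) log2(1 - p). A quick tabulation (illustrative code, not from the slides):

```python
import math

def Hb(p: float) -> float:
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in [0.0, 0.1, 0.2, 0.5, 0.9, 1.0]:
    print(f"Hb({p:.1f}) = {Hb(p):.4f}")
# Symmetric about p = 0.5, where it peaks at 1 bit.
```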

  17. §1.3 Average Information (Entropy)

  18. §1.3 Average Information (Entropy)
  Mutual information of a channel. [Block diagram: Source → Channel → Sink]

  19. §1.3 Average Information (Entropy)
  Mutual information (互信息): how much we don't know about the source BEFORE observing the channel output, minus how much we still don't know AFTER observing it.

  20. §1.3 Average Information (Entropy)
  Properties of mutual information. [Fig.: a Venn diagram relating H(X), H(Y), H(X,Y) and I(X;Y)]
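The Venn-diagram picture corresponds to the standard identity I(X;Y) = H(X) + H(Y) - H(X,Y). A minimal sketch with a made-up joint distribution (not one from the slides):

```python
import math

def H(probs):
    """Entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up joint distribution p(x, y) over two binary variables:
joint = [[0.4, 0.1],
         [0.1, 0.4]]
pX = [sum(row) for row in joint]          # marginal of X
pY = [sum(col) for col in zip(*joint)]    # marginal of Y

I_XY = H(pX) + H(pY) - H([p for row in joint for p in row])
print(I_XY)   # ~0.278 bits shared between X and Y
```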

  21. §1.3 Average Information (Entropy)
  Example 1.3: Given the binary symmetric channel shown [channel diagram: P(0|0) = P(1|1) = 0.8, P(0|1) = P(1|0) = 0.2], please determine the mutual information of such a channel.
  Solution (a sketch of the computation follows slide 23 below):
  • Entropy of the binary source is:

  22. §1.3 Average Information (Entropy)

  23. §1.3 Average Information (Entropy)
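The worked equations of the solution on slides 21-23 are not preserved. Assuming an equiprobable binary source, the computation would run: H(X) = H(Y) = 1 bit by symmetry, H(Y|X) = Hb(0.2) ≈ 0.7219 bits, hence I(X;Y) = H(Y) - H(Y|X) ≈ 0.2781 bits. As a sketch:

```python
import math

def Hb(p):
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

crossover = 0.2               # from the channel diagram of Example 1.3
H_Y = 1.0                     # output equiprobable when the (assumed) input is
H_Y_given_X = Hb(crossover)   # uncertainty the channel itself adds
print(H_Y - H_Y_given_X)      # I(X;Y) ~ 0.2781 bits per channel use
```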

  24. §1.4 Channel Capacity
  [Block diagram: Source → Channel → Sink]

  25. §1.4 Channel Capacity
  Channel capacity of the binary symmetric channel (BSC); a sketch follows slide 27 below.

  26. §1.4 Channel Capacity

  27. §1.4 Channel Capacity
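The capacity derivation on slides 25-27 is not preserved; the standard BSC result, assumed below, is C = 1 - Hb(p), achieved by an equiprobable input:

```python
import math

def Hb(p):
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def bsc_capacity(p: float) -> float:
    """C = 1 - Hb(p) for crossover probability p."""
    return 1.0 - Hb(p)

for p in [0.0, 0.1, 0.2, 0.5]:
    print(f"p = {p:.1f}: C = {bsc_capacity(p):.4f} bits/use")
# At p = 0.5 the output is independent of the input, so C = 0.
```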

  28. §1.4 Channel Capacity
  Channel capacity of the binary erasure channel (BEC). [Channel diagram: input 0 → output 0 and input 1 → output 1 when not erased; either input may instead be erased.] A sketch follows slide 31 below.

  29. §1.4 Channel Capacity

  30. §1.4 Channel Capacity

  31. §1.4 Channel Capacity
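Likewise for the BEC on slides 28-31: the standard result, assumed here, is C = 1 - ε for erasure probability ε.

```python
def bec_capacity(eps: float) -> float:
    """C = 1 - eps: the fraction of bits that survive unerased."""
    return 1.0 - eps

for eps in [0.0, 0.2, 0.5, 1.0]:
    print(f"eps = {eps:.1f}: C = {bec_capacity(eps):.1f} bits/use")
# Compare C_BEC(0.2) = 0.8 with C_BSC(0.2) ~ 0.278: knowing WHERE the
# channel corrupted a bit (an erasure) hurts far less than a silent flip.
```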

  32. §1.4 Channel Capacity
  Channel capacity of the binary AWGN channel; a sketch follows slide 35 below.

  33. §1.4 Channel Capacity

  34. §1.4 Channel Capacity

  35. §1.4 Channel Capacity
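Slides 32-35 are not preserved, and the binary-input AWGN capacity has no closed form (it requires numerical integration). As a reference curve, the classic Shannon formula for an unconstrained (Gaussian) input, C = (1/2) log2(1 + S/N) bits per channel use, is easy to sketch:

```python
import math

def awgn_capacity(snr_db: float) -> float:
    """Shannon capacity, bits per channel use, for a Gaussian input."""
    snr = 10.0 ** (snr_db / 10.0)
    return 0.5 * math.log2(1.0 + snr)

for snr_db in [-10, 0, 10, 20]:
    print(f"SNR = {snr_db:4d} dB: C = {awgn_capacity(snr_db):.3f} bits/use")
# A binary input can never exceed 1 bit/use, however high the SNR.
```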

  36. §1.4 Channel Capacity
  Now we are ready to build a complete communication system:
  Source data → Source coding → Channel coding → Modulation → Channel → Demodulation → Channel decoding → Source decoding → Estimation of source data
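To tie the chain together, a toy end-to-end sketch (the function names and the repetition-code choice are illustrative, not from the slides): a rate-1/3 repetition code over the BSC of Example 1.3, decoded by majority vote.

```python
import random

def channel_encode(bits):              # toy rate-1/3 repetition code
    return [b for bit in bits for b in (bit, bit, bit)]

def bsc(bits, p=0.2):                  # the channel of Example 1.3
    return [b ^ (random.random() < p) for b in bits]

def channel_decode(bits):              # majority vote per 3-bit block
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(1)
data = [random.randint(0, 1) for _ in range(10_000)]
decoded = channel_decode(bsc(channel_encode(data)))
ber = sum(d != r for d, r in zip(data, decoded)) / len(data)
print(ber)   # ~0.104 = 3(0.2^2)(0.8) + 0.2^3, down from the raw 0.2
```

Even this simple code roughly halves the bit error rate at the cost of tripling the transmitted bits; the codes of Chapters 4-6 (convolutional, turbo, LDPC) do far better.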
