
符号理論 ...coding theory






Presentation Transcript


  1. 符号理論 ... Coding Theory • The official language of this class is English • slides and talks are (basically) in English • questions and comments in Japanese are also welcome • omnibus-style lecture ... a collection of several subjects • "take-home" test ... questions are handed out; you solve them at home

  2. Il Nome della Classe / The Name of the Class Coding Theory? • a branch of Information Theory • properties/constructions of "codes" • source codes (for data compression) • channel codes (for error correction) • and various codes for various purposes this class = an intermediate-level class on Information Theory, with some emphasis on the techniques of coding

  3. relation to Information Theory • measuring of information: entropy, mutual information • source coding: Kraft's inequality, Huffman code, universal code, source coding theorem • channel coding: analysis of codes, linear code, convolutional code, Hamming code, channel coding theorem, Turbo & LDPC codes • codes for data recording, network coding, and more...

  4. class plan • seven classes, one test
Oct. 9  review ... brief review of information theory
Oct. 16 compress ... arithmetic code, universal codes
Oct. 23 analyze ... analysis of codes, weight distribution
Oct. 30 no class
Nov. 6  struggle ... cyclic code, convolutional code
Nov. 13 Shannon ... channel coding theorem
Nov. 20 frontier ... Turbo code, LDPC code
Nov. 27 unique ... coding for various purposes
take-home test
slides ... http://isw3.naist.jp/~kaji/lecture/

  5. Information Theory Information Theory (情報理論) • was founded by C. E. Shannon in 1948 • focuses on the mathematical theory of communication • has had an essential impact on today's digital technology • wired/wireless communication/broadcasting • CD/DVD/HDD • data compression • cryptography, linguistics, bioinformatics, games, ... Claude E. Shannon 1916-2001

  6. the model of communication A communication system can be modeled as in Shannon's diagram: the encoder and decoder are engineering artifacts; the other components are "given" and not controllable. C. E. Shannon, "A Mathematical Theory of Communication," The Bell System Technical Journal, vol. 27, pp. 379–423, 623–656, 1948.

  7. the first step Precise measurement is essential in engineering, but information cannot be measured directly. • To handle information by engineering means, we need to develop a quantitative measure of information. Entropy serves this purpose!

  8. the model of information source An information source is a machinery which produces symbols. • The symbol produced is determined probabilistically. • Use a random variable X to represent the produced symbol. • X takes one value in a finite set {x1, ..., xM}. • P(X = x) denotes the probability that X takes the value x. (We mainly focus on memoryless & stationary sources.)

  9. entropy the entropy of X: H(X) = -Σ P(X = x) log2 P(X = x) [bit], the sum taken over all values x of X • H(X) is the expected value of -log2 P(X = x) over all x • -log2 P(X = x) is sometimes called the self-information of x • H(X) is sometimes called the expected information of X
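The entropy definition above can be sketched in Python (a minimal illustration, not part of the original slides; the function name `entropy` is our own):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2 p, in bits.
    Terms with p == 0 contribute 0 by convention."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # about 0.469
```

A uniform distribution over M values gives the maximum, log2 M bits.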

  10. entropy and uncertainty (不確実さ) a cheat (loaded) die is easier to guess, so its entropy is smaller than that of a fair die • The more difficult it is to guess the value of X correctly, the larger the entropy H(X) is. entropy = the size of uncertainty

  11. basic properties of entropy • H(X) ≥ 0 ...【nonnegative】 • H(X) = 0 ...【smallest value】 • attained when P(X = x) = 1 for one particular value x • H(X) = log2 M ...【largest value】(M = the number of possible values) • attained when P(X = x) = 1/M for all x

  12. some more entropies • joint entropy H(X, Y) = -Σ P(X = x, Y = y) log2 P(X = x, Y = y) • conditional entropy H(Y|X) = H(X, Y) - H(X) • if X and Y are independent, then H(X, Y) = H(X) + H(Y)
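As a sketch (our own example, not from the slides), the joint and conditional entropies can be checked numerically for two independent fair bits:

```python
import math

def H(probs):
    """Shannon entropy in bits"""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# joint distribution of two independent fair bits (hypothetical example)
joint = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}

H_XY = H(joint.values())  # joint entropy H(X,Y)
pX = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
H_X = H(pX)               # marginal entropy H(X)
H_Y_given_X = H_XY - H_X  # conditional entropy via the chain rule

print(H_XY, H_X, H_Y_given_X)  # 2.0 1.0 1.0 -- H(X,Y) = H(X) + H(Y) here
```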

  13. mutual information • mutual information between X and Y: I(X; Y) = H(X) + H(Y) - H(X, Y) • if X and Y are independent: I(X; Y) = 0

  14. example binary symmetric channel (BSC) • a bit X ∈ {0, 1} is transmitted • a bit Y ∈ {0, 1} is received; each bit is flipped with a fixed crossover probability • compute I(X; Y); for simplicity, define a binary entropy function H(p) = -p log2 p - (1-p) log2 (1-p)

  15. example solved • compute I(X; Y) = H(Y) - H(Y|X), using the binary entropy function
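The BSC computation above can be sketched numerically (our own naming: `eps` is the crossover probability and `q = P(X = 1)`):

```python
import math

def h(p):
    """binary entropy function h(p) = -p log2 p - (1-p) log2 (1-p)"""
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def mutual_info_bsc(q, eps):
    """I(X;Y) for a BSC: I = H(Y) - H(Y|X) = h(P(Y=1)) - h(eps)"""
    p_y1 = q*(1-eps) + (1-q)*eps   # probability that the received bit is 1
    return h(p_y1) - h(eps)

print(mutual_info_bsc(0.5, 0.1))   # 1 - h(0.1), about 0.531 bit
```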

  16. good input and bad input • the crossover probability is a channel-specific constant • the input distribution P(X = 1) is a controllable parameter • the channel works poorly for inputs with P(X = 1) near 0 or 1 • the channel works well for inputs with P(X = 1) near 1/2

  17. channel capacity channel capacity = the maximum of I(X; Y) over the input distribution, with • X = input to the channel • Y = output from the channel • the channel capacity of a BSC with crossover probability p = 1 - H(p) • the channel capacity of a binary erasure channel = 1 - p, where p is the probability of erasure Channel capacities of some practical channels are also studied.
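The maximization defining the capacity can be checked by brute force for the BSC (a sketch, not the slide's derivation; the grid search and names are ours):

```python
import math

def h(p):
    """binary entropy function"""
    return 0.0 if p in (0.0, 1.0) else -p*math.log2(p) - (1-p)*math.log2(1-p)

def bsc_mutual_info(q, eps):
    """I(X;Y) for a BSC with input distribution q = P(X=1)"""
    p_y1 = q*(1-eps) + (1-q)*eps
    return h(p_y1) - h(eps)

eps = 0.1
# search over input distributions q = P(X=1) on a fine grid
best_q = max((i/1000 for i in range(1001)),
             key=lambda q: bsc_mutual_info(q, eps))
print(best_q)        # 0.5 -- the uniform input achieves the maximum
print(1 - h(eps))    # capacity of the BSC = 1 - h(eps)
```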

  18. source coding • Source coding gives a representation (codewords, produced by an encoder) of information. • The representation must be as small (short) as possible.

  19. problem formulation • source symbols x1, ..., xM • construct a code C = {c1, ..., cM}, with a codeword ci for each xi our goal is to construct C so that • C is immediately decodable, and • the average codeword length of C, L(C) = Σ P(X = xi)·|ci|, is as small as possible.

  20. source coding theorem Shannon's source coding theorem: • There is no immediately decodable code with average length L < H(X). • proof by Kraft's inequality and Shannon's lemma • We can construct an immediately decodable code with L < H(X) + ε for any small ε > 0. • construction of a block Huffman code ... two faces of source coding
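Kraft's inequality, used in the proof, can be checked numerically (a sketch; the function name is ours): a set of codeword lengths admits an immediately decodable (prefix) code over an r-ary alphabet iff the sum of r^(-length) is at most 1.

```python
def kraft_sum(lengths, r=2):
    """Kraft sum: a prefix code with these codeword lengths exists
    iff this value is <= 1."""
    return sum(r**(-l) for l in lengths)

print(kraft_sum([1, 2, 3, 3]))  # 1.0  -> realizable, e.g. 0, 10, 110, 111
print(kraft_sum([1, 1, 2]))     # 1.25 -> no prefix code with these lengths
```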

  21. Huffman code Code construction by iterative tree operations • prepare isolated nodes, each attached with the probability of a symbol (node = size-one tree) • repeat the following operation until all trees are joined into one: • select the two trees T1 and T2 having the smallest probabilities • join T1 and T2 by introducing a new parent node • the sum of the probabilities of T1 and T2 is given to the new tree David Huffman 1925-1999

  22. construction example
symbol  A    B    C    D    E
prob.   0.2  0.1  0.3  0.3  0.1
codewords: read off the edge labels from root to leaf in the resulting tree
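The tree-joining construction, applied to these probabilities, can be sketched in Python (our own implementation sketch; tie-breaking between equal probabilities is arbitrary, so the exact codewords may differ from the slide, but the codeword lengths and the average length do not):

```python
import heapq
from itertools import count

def huffman(probs):
    """Huffman code by iterative tree joining.
    probs: dict symbol -> probability. Returns dict symbol -> codeword."""
    tiebreak = count()  # avoids comparing dicts when probabilities are equal
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)  # two trees with smallest probabilities
        p2, _, t2 = heapq.heappop(heap)
        joined = {s: "0" + c for s, c in t1.items()}   # new parent node:
        joined.update({s: "1" + c for s, c in t2.items()})  # prefix 0 / 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), joined))
    return heap[0][2]

probs = {"A": 0.2, "B": 0.1, "C": 0.3, "D": 0.3, "E": 0.1}
code = huffman(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
print(code)
print(avg_len)   # 2.2 bits per symbol on average
```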

  23. problems of block Huffman code The optimum code is obtained by • grouping several symbols into one, and • applying the Huffman code construction practical problems arise: • we need a large amount of storage • we need to know the probability distribution in advance ... solutions to these problems are discussed in this class.

  24. channel coding • Errors are unavoidable in communication: ABCABC is sent, ABCADC is received. • Some errors are correctable by adding some redundancy: send "Alpha, Bravo, Charlie" for "ABC", and the receiver recovers ABC even if a few letters are corrupted. • Channel coding gives a clever way to introduce the redundancy.
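The "Alpha, Bravo, Charlie" idea, adding redundancy so that errors can be voted away, can be sketched with a toy triple-repetition code (our own illustration, not a code from the slides):

```python
from collections import Counter

def encode(bits):
    """triple-repetition code: transmit each bit three times"""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(received):
    """majority vote over each group of three corrects
    any single error per group"""
    return [Counter(received[i:i+3]).most_common(1)[0][0]
            for i in range(0, len(received), 3)]

sent = encode([1, 0, 1])   # [1,1,1, 0,0,0, 1,1,1]
sent[4] ^= 1               # one transmission error
print(decode(sent))        # [1, 0, 1] -- the error is corrected
```

The price of this correction is a code rate of only 1/3, a trade-off the later slides quantify.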

  25. linear code linear code: a practical class of channel codes • encoding is done using a generator matrix G • codeword c = mG for a message vector m • decoding is done using a parity check matrix H • syndrome s = Hr^T for a received vector r • The syndrome indicates the position of errors.

  26. Hamming code To construct a one-bit error-correcting code, make the column vectors of the parity check matrix H all different. • Hamming code • determine a parameter m • enumerate all nonzero vectors of length m • use the vectors as columns of H Richard Hamming 1915-1998
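The construction can be sketched for m = 3, giving the (7, 4) Hamming code (a minimal sketch over GF(2); the column ordering of H and the function names are our choice):

```python
def hamming_H(m):
    """parity check matrix whose columns are all 2^m - 1 nonzero
    length-m binary vectors (here: the binary expansions of 1..2^m-1)"""
    n = 2**m - 1
    return [[(j >> (m - 1 - i)) & 1 for j in range(1, n + 1)]
            for i in range(m)]

def syndrome(H, word):
    """syndrome = H * word^T over GF(2); a nonzero syndrome
    matches the column of H at the error position"""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

H = hamming_H(3)                  # (7,4) Hamming code: n = 7, k = 4
received = [0, 0, 0, 0, 0, 0, 0]  # the all-zero codeword...
received[4] ^= 1                  # ...with one bit flipped at position 4
s = syndrome(H, received)
print(s)                          # equals column 4 of H -> error located
```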

  27. 2 3 4 5 6 7 3 7 15 31 63 127 1 4 11 26 57 120 parameters of Hamming code Hamming code • determine • design to have different column vectors has rows and columns • code length • # of information symbols • # of parity symbols code rate =

  28. code rate and performance • If the code rate k/n is large... • more information is carried in one codeword • fewer symbols are available for error correction • the error-correcting capability is weak in general. To have a good error-correcting capability, we need to sacrifice the code rate: a small code rate allows strong error-correcting capability, a large code rate gives only weak capability.

  29. channel coding theorem Shannon's channel coding theorem: Let C be the capacity of the communication channel. • Among channel codes with rate R < C, there exists a code which can correct almost all errors. • There is no such code in the class of codes with rate R > C. ... two faces of channel coding

  30. two coding theorems • source coding theorem: • constructive solution given by the Huffman code • almost finished work • channel coding theorem: • the proof of the theorem gives no constructive solution • a number of studies have been made • remarkable classes of channel codes • still under investigation

  31. summary today's talk ... not self-contained; a summary of Information Theory • measuring of information • source coding • channel coding Students are encouraged to review the basics of Information Theory.
