
Quantity of Information (noiseless system)



Presentation Transcript


  1. A review of important points from Part I • Quantity of Information (noiseless system): a) depends on the probability of the event; b) depends on the length of the message. For an event of probability $p$, the information conveyed is $I = -\log_2 p$ bits. • Average Information: Entropy. For a source producing many symbols with probabilities $p_1, p_2, \ldots, p_n$, the average information is $H = -\sum_i p_i \log_2 p_i$ bits/symbol.
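The two formulas above can be checked with a few lines of Python (a minimal sketch; the function names are my own):

```python
from math import log2

def self_information(p: float) -> float:
    """Information conveyed by an event of probability p: I = -log2(p) bits."""
    return -log2(p)

def entropy(probs) -> float:
    """Average information H = -sum(p_i * log2(p_i)) in bits/symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(self_information(0.5))       # 1.0 bit: a 50/50 event
print(self_information(0.01))      # ~6.64 bits: rarer events carry more information
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/symbol
```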

  2. Maximum entropy: entropy is greatest when all symbols are equally probable, giving $H_{\max} = \log_2 n$ for an $n$-symbol source. For a binary source with $P(1) = p$, $H = -p \log_2 p - (1-p) \log_2 (1-p)$, which peaks at 1 bit/symbol when $p = 1/2$.
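A quick numerical confirmation of the binary-source maximum (a sketch; `binary_entropy` is my own helper):

```python
from math import log2

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"p = {p}: H = {binary_entropy(p):.3f} bits")
# The maximum, 1 bit/symbol, occurs at p = 0.5 (equiprobable symbols).
```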

  3. Redundancy • Conditional entropy $H(j|i)$: if there is intersymbol influence, the average information is given by $H(j|i) = -\sum_i \sum_j P(i,j) \log_2 P(j|i)$, where $P(j|i)$ is the conditional probability (probability of $j$ given $i$) and $P(i,j) = P(i)\,P(j|i)$ is the joint probability. Redundancy measures how far the source falls short of maximum entropy: $R = 1 - H/H_{\max}$.
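A sketch of the conditional-entropy calculation (the helper and the example joint distribution are my own, chosen to show intersymbol influence):

```python
from math import log2

def conditional_entropy(joint) -> float:
    """H(j|i) = -sum over (i,j) of P(i,j) * log2(P(j|i)), with P(j|i) = P(i,j)/P(i).

    `joint` maps (i, j) pairs to joint probabilities P(i, j).
    """
    p_i = {}                                   # marginal P(i), summed over j
    for (i, _), p in joint.items():
        p_i[i] = p_i.get(i, 0.0) + p
    return -sum(p * log2(p / p_i[i]) for (i, _), p in joint.items() if p > 0)

# A source whose symbols tend to repeat: strong intersymbol influence.
joint = {("a", "a"): 0.4, ("a", "b"): 0.1, ("b", "a"): 0.1, ("b", "b"): 0.4}
print(conditional_entropy(joint))  # ~0.722 bits, below the 1 bit of an influence-free source
```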

  4. Coding in a noiseless channel: source coding (speed of transmission is the main consideration). • Important properties of codes • uniquely decodable (all combinations of code words distinct) • instantaneous (no code word is a prefix of another) • compact (shorter code words given to more probable symbols)
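The instantaneous property is easy to test mechanically (a sketch; `is_instantaneous` is my own helper). After sorting, a code word can only be a prefix of its lexicographic neighbour, so checking adjacent pairs suffices:

```python
def is_instantaneous(codewords) -> bool:
    """True iff no code word is a prefix of another."""
    words = sorted(codewords)                  # prefixes sort next to their extensions
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

print(is_instantaneous(["0", "11", "101", "1000", "1001"]))  # True (the code on slide 8)
print(is_instantaneous(["0", "01", "11"]))                   # False: "0" prefixes "01"
```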

  5. Important parameters: average code-word length $L = \sum_i p_i l_i$, where $l_i$ is the length (in binary digits) of the code word for symbol $i$; efficiency $E = H/L$. • Coding methods • Fano-Shannon method • Huffman's method
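Both parameters in a short sketch (function names are my own):

```python
from math import log2

def average_length(probs, lengths) -> float:
    """L = sum(p_i * l_i), in binary digits per symbol."""
    return sum(p * l for p, l in zip(probs, lengths))

def efficiency(probs, lengths) -> float:
    """E = H / L: how close the code comes to the entropy lower bound."""
    H = -sum(p * log2(p) for p in probs if p > 0)
    return H / average_length(probs, lengths)

probs, lengths = [0.5, 0.25, 0.25], [1, 2, 2]
print(average_length(probs, lengths))  # 1.5 digits/symbol
print(efficiency(probs, lengths))      # 1.0: this code meets the entropy bound exactly
```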

  6. Coding methods • Fano-Shannon method: 1. Write the symbols in a table in descending order of probability; 2. Insert dividing lines to successively divide the probabilities into halves, quarters, etc. (or as near as possible); 3. Add a '0' or '1' to the code at each division; 4. Read off the final code for each symbol from the first division towards that symbol. A sketch of the procedure follows.
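This is my own implementation of the steps above; which side of each division gets the '0' is an arbitrary choice:

```python
def fano_shannon(symbols):
    """Fano-Shannon coding: `symbols` maps symbol -> probability;
    returns symbol -> binary code string."""
    codes = {s: "" for s in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total, running = sum(p for _, p in group), 0.0
        best_k, best_diff = 1, float("inf")
        for k in range(1, len(group)):         # division nearest to halving the probability
            running += group[k - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_k, best_diff = k, diff
        for s, _ in group[:best_k]:
            codes[s] += "0"                    # '0' above the dividing line
        for s, _ in group[best_k:]:
            codes[s] += "1"                    # '1' below it
        split(group[:best_k])
        split(group[best_k:])

    split(sorted(symbols.items(), key=lambda kv: -kv[1]))  # descending probabilities
    return codes

print(fano_shannon({"A": 0.8, "B": 0.15, "C": 0.05}))
# {'A': '0', 'B': '10', 'C': '11'}
```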

  7. Coding methods • Huffman's method: 1. Write the symbols in a table in descending order of probability; 2. Add the probabilities in pairs from the bottom and reorder; 3. Place a '0' or '1' at each branch; 4. Read off the final code for each symbol from the root of the tree towards that symbol.
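A sketch of Huffman's method using a priority queue in place of the manual reordering step (tie-breaking, and hence the exact digits, may differ from slide 8, but the average length comes out the same):

```python
import heapq

def huffman(symbols):
    """Huffman coding: repeatedly merge the two least probable entries,
    prefixing a '0' to one branch and a '1' to the other."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(symbols.items())]
    heapq.heapify(heap)
    tag = len(heap)                            # unique tie-breaker for merged nodes
    while len(heap) > 1:
        p0, _, zeros = heapq.heappop(heap)     # the two least probable nodes
        p1, _, ones = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in zeros.items()}
        merged.update({s: "1" + c for s, c in ones.items()})
        heapq.heappush(heap, (p0 + p1, tag, merged))
        tag += 1
    return heap[0][2]

probs = {"S1": 0.5, "S2": 0.2, "S3": 0.1, "S4": 0.1, "S5": 0.1}
codes = huffman(probs)
print(codes)
L = sum(p * len(codes[s]) for s, p in probs.items())
print(f"L = {L:.1f}")  # 2.0, matching slide 9
```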

  8. Codes: S1: 0 S2: 11 S3: 101 S4: 1000 S5: 1001

  9. $L = 0.5 \times 1 + 0.2 \times 2 + 0.1 \times 3 + 2 \times 0.1 \times 4 = 2.0$ binary digits/symbol; $H = 1.96$ bits/symbol; $E = H/L = 0.98$.
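The arithmetic checks out directly against the slide-8 code words:

```python
from math import log2

probs = [0.5, 0.2, 0.1, 0.1, 0.1]           # p(S1) .. p(S5)
codes = ["0", "11", "101", "1000", "1001"]  # the code from slide 8

L = sum(p * len(c) for p, c in zip(probs, codes))
H = -sum(p * log2(p) for p in probs)
print(f"L = {L:.1f}, H = {H:.2f}, E = H/L = {H / L:.2f}")  # L = 2.0, H = 1.96, E = 0.98
```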

  10. Shannon's first theorem: Shannon proved formally that if the source symbols are coded in groups of $n$, then the average length per symbol tends to the source entropy $H$ as $n$ tends to infinity. In consequence, a further increase in efficiency can be obtained by grouping the source symbols (in pairs, threes, and so on) and applying the coding procedure to the probabilities of the chosen groups, as the sketch below illustrates. • Matching source to channel: the coding process is sometimes known as 'matching the source to the channel', that is, making the output of the coder as suitable as possible for the channel.
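A numerical illustration of the theorem (a sketch; `huffman_lengths` is my own helper, and the source is the A, B, C source of the next slide). Coding ever-larger groups drives the per-symbol average length down toward $H$:

```python
import heapq
from itertools import product
from math import log2, prod

def huffman_lengths(probs):
    """Code-word lengths of a compact (Huffman) code for the given probabilities."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p0, _, a = heapq.heappop(heap)         # merge the two least probable nodes
        p1, tag, b = heapq.heappop(heap)
        for idx in a + b:
            lengths[idx] += 1                  # each symbol under the merge gains a digit
        heapq.heappush(heap, (p0 + p1, tag, a + b))
    return lengths

source = [16/20, 3/20, 1/20]                   # independent symbols A, B, C
H = -sum(p * log2(p) for p in source)          # ~0.884 bits/symbol
for n in (1, 2, 3, 4):
    block = [prod(c) for c in product(source, repeat=n)]  # n-symbol group probabilities
    L = sum(p * l for p, l in zip(block, huffman_lengths(block)))
    print(f"n = {n}: L/n = {L / n:.4f} digits/symbol (H = {H:.4f})")
# Coding n-blocks guarantees H <= L/n < H + 1/n, so L/n tends to H as n grows.
```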

  11. Example: An information source produces a long sequence of three independent symbols A, B, C with probabilities 16/20, 3/20 and 1/20 respectively; 100 such symbols are produced per second. The information is to be transmitted via a noiseless binary channel which can transmit up to 100 binary digits per second. Design a suitable compact instantaneous code and find the probabilities of the binary digits produced. [Block diagram: source → coder → noiseless binary channel → decoder, with 100 symbols/s entering the coder and binary digits 0, 1 on the channel.] With p(A) = 16/20, p(B) = 3/20, p(C) = 1/20, coding the symbols singly by the Fano-Shannon method gives P(0) = 0.73, P(1) = 0.27.
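A sketch of the single-symbol calculation, assuming the Fano-Shannon code A → 0, B → 10, C → 11 produced by the procedure on slide 6 (the digit probabilities depend on which side of each division is labelled '0', so the exact split varies with that choice):

```python
probs = {"A": 16/20, "B": 3/20, "C": 1/20}
codes = {"A": "0", "B": "10", "C": "11"}   # one Fano-Shannon assignment

# Average code length and the resulting digit rate at 100 symbols/s.
L = sum(probs[s] * len(c) for s, c in codes.items())
print(f"L = {L:.1f} digits/symbol -> {100 * L:.0f} digits/s")   # 1.2 -> 120 digits/s

# Relative frequencies of 0s and 1s in the coded stream.
zeros = sum(probs[s] * c.count("0") for s, c in codes.items())
print(f"p(0) = {zeros / L:.2f}, p(1) = {1 - zeros / L:.2f}")
```

Note that single-symbol coding produces 120 digits/s, more than the channel's 100 digits/s, which is what forces the pair coding of the next slide.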

  12. Coding singly gives L = 1.2 binary digits per symbol, i.e. 120 digits/s, more than the channel can carry, so the symbols are coded in pairs instead: L = 1.865 digits per pair, giving R = 93.25 binary digits/s, within the channel capacity. The digit probabilities become p(0) = 0.547, p(1) = 0.453, and the entropy of the output stream is $-(p(0)\log_2 p(0) + p(1)\log_2 p(1)) = 0.993$ bits per digit, close to the maximum value of 1 bit (reached when $p(0) = p(1) = 0.5$).
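A minimal check of the output-stream entropy figure, using the digit probabilities quoted on the slide:

```python
from math import log2

p0, p1 = 0.547, 0.453   # digit probabilities from the slide
H_stream = -(p0 * log2(p0) + p1 * log2(p1))
print(f"{H_stream:.4f} bits/digit")   # ~0.9936, the slide's 0.993; the maximum is 1 bit at p0 = p1 = 0.5
```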
