
Mobile Communication Engineering


Presentation Transcript


  1. Mobile Communication Engineering: Review on Fundamental Limits on Communications. Hyuckjae Lee

  2. Fundamental Limits on Performance
  • Given an information source and a noisy channel:
  1) Limit on the minimum number of bits per symbol
  2) Limit on the maximum rate for reliable communication
  => Shannon's 3 theorems

  3. Uncertainty, Information and Entropy
  • Let the source alphabet be $\mathcal{S} = \{s_0, s_1, \ldots, s_{K-1}\}$, with the probabilities of occurrence $p_k = P(S = s_k)$, $k = 0, 1, \ldots, K-1$, $\sum_k p_k = 1$.
  • Assume a discrete memoryless source (DMS): successive symbols are statistically independent.
  What is the measure of information?

  4. Uncertainty, Information, and Entropy (cont')
  • Interrelations between information, uncertainty and surprise: no surprise means no information.
  • If A is a surprise and B is another surprise, what is the total information of A and B occurring simultaneously?
  • The amount of information may be related to the inverse of the probability of occurrence.

  5. Properties of Information
  • Define the amount of information of symbol $s_k$ as $I(s_k) = \log(1/p_k) = -\log p_k$.
  1) $I(s_k) = 0$ for $p_k = 1$ (a certain event carries no information)
  2) $I(s_k) \ge 0$ for $0 \le p_k \le 1$
  3) $I(s_k) > I(s_i)$ for $p_k < p_i$ (the less probable event carries more information)
  4) $I(s_k s_i) = I(s_k) + I(s_i)$ if $s_k$ and $s_i$ are statistically independent
  * Custom is to use the logarithm of base 2, so that information is measured in bits.
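
  A minimal numeric sketch of these properties in Python; the helper name info and the example probabilities are illustrative assumptions, not taken from the slides:

    from math import log2

    def info(p):
        # Amount of information I = log2(1/p), in bits, of an event with probability p
        return log2(1.0 / p)

    print(info(1.0))                                   # property 1: a certain event carries no information -> 0.0
    print(info(0.5), info(0.125))                      # property 3: the less probable event carries more information -> 1.0, 3.0
    print(info(0.5 * 0.125), info(0.5) + info(0.125))  # property 4: information of independent events adds -> 4.0, 4.0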

  6. Entropy
  • Def.: a measure of the average information content per source symbol.
  The mean value of $I(s_k)$ over the source alphabet $\mathcal{S}$ is
  $H(S) = E[I(s_k)] = \sum_{k=0}^{K-1} p_k I(s_k) = \sum_{k=0}^{K-1} p_k \log_2(1/p_k)$
  Properties of H: $0 \le H(S) \le \log_2 K$
  1) $H(S) = 0$ iff $p_k = 1$ for some $k$ and all the other probabilities are zero -> no uncertainty
  2) $H(S) = \log_2 K$ iff $p_k = 1/K$ for all $k$ -> maximum uncertainty
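
  A short Python sketch of the entropy formula; the function name and the example distributions are assumptions for illustration:

    from math import log2

    def entropy(probs):
        # H(S) = sum_k p_k * log2(1/p_k), in bits per source symbol; zero-probability symbols contribute nothing
        return sum(p * log2(1.0 / p) for p in probs if p > 0.0)

    print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0  -> no uncertainty (property 1)
    print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0  -> log2(K) for K = 4, maximum uncertainty (property 2)
    print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75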

  7. Entropy of Binary Memoryless Source
  • For a binary source emitting symbol 0 with probability $p_0$ and symbol 1 with probability $p_1 = 1 - p_0$:
  $H(S) = -p_0 \log_2 p_0 - (1 - p_0) \log_2 (1 - p_0) \equiv H(p_0)$ (the entropy function), which is maximized at $p_0 = 1/2$ with $H(S) = 1$ bit.
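
  A sketch of the entropy function of a binary memoryless source; names and sample values are assumptions:

    from math import log2

    def binary_entropy(p0):
        # H(p0) = -p0*log2(p0) - (1-p0)*log2(1-p0), entropy of a binary memoryless source in bits
        if p0 in (0.0, 1.0):
            return 0.0
        return -p0 * log2(p0) - (1.0 - p0) * log2(1.0 - p0)

    print(binary_entropy(0.5))   # 1.0    -> maximum, reached for equally likely symbols
    print(binary_entropy(0.1))   # ~0.469
    print(binary_entropy(1.0))   # 0.0    -> a deterministic source carries no information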

  8. Source Coding Theorem
  • Source encoding: efficient representation of data (data compaction).
  Requirements: the code must be uniquely decodable, and the statistics of the source must be known.
  (There is an algorithm called "Lempel-Ziv" for sources with unknown statistics; see Proakis' book.)
  (Another frequent method is the run-length code.)

  9. Source Coding Theorem (cont')
  [Block diagram: DMS -> Source Encoder -> binary sequence]
  • Codes may be of variable length or of fixed length.
  • The average code-word length, $\bar{L}$, is $\bar{L} = \sum_{k=0}^{K-1} p_k l_k$, where $l_k$ is the length (in bits) of the code word assigned to symbol $s_k$.
  • The coding efficiency is $\eta = L_{\min} / \bar{L}$, where $L_{\min}$ is the minimum possible value of $\bar{L}$.

  10. Shannon's first theorem : Source-coding theorem
  • Given a DMS of entropy H(S), the average code-word length for any source code is bounded as $\bar{L} \ge H(S)$.
  i.e.) $L_{\min} = H(S)$, so the coding efficiency is $\eta = H(S) / \bar{L}$.
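
  A sketch of the average code-word length and the efficiency $\eta = H(S)/\bar{L}$; the example distribution and code-word lengths are assumptions, chosen so that $p_k = 2^{-l_k}$:

    from math import log2

    def average_length(probs, lengths):
        # L_bar = sum_k p_k * l_k, in bits per source symbol
        return sum(p * l for p, l in zip(probs, lengths))

    def coding_efficiency(probs, lengths):
        # eta = H(S) / L_bar, using L_min = H(S) from the source-coding theorem
        h = sum(p * log2(1.0 / p) for p in probs if p > 0.0)
        return h / average_length(probs, lengths)

    probs   = [0.5, 0.25, 0.125, 0.125]   # assumed example distribution
    lengths = [1, 2, 3, 3]                # lengths of an assumed code {0, 10, 110, 111}
    print(average_length(probs, lengths))     # 1.75
    print(coding_efficiency(probs, lengths))  # 1.0 -> eta = 1 here because p_k = 2^(-l_k)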

  11. Practical Source Coding
  • Prefix coding. Def.: a code in which no code word is the prefix of any other code word; such a code is always uniquely decodable.
  Ex) see the sketch below.
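
  A sketch checking the prefix condition; the sample code words {0, 10, 110, 111} and {0, 01, 11} are illustrative assumptions, and the Kraft-McMillan sum is included as a related sanity check:

    def is_prefix_free(codewords):
        # True if no code word is the prefix of any other code word
        return not any(a != b and b.startswith(a) for a in codewords for b in codewords)

    def kraft_sum(codewords):
        # Kraft-McMillan sum: sum_k 2^(-l_k); it is at most 1 for any prefix code
        return sum(2.0 ** -len(c) for c in codewords)

    print(is_prefix_free(["0", "10", "110", "111"]))  # True  -> a prefix code
    print(is_prefix_free(["0", "01", "11"]))          # False -> "0" is a prefix of "01"
    print(kraft_sum(["0", "10", "110", "111"]))       # 1.0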

  12. Practical Source Coding (cont')
  [Figure: decoding decision tree; starting from the initial state, each received bit (0 or 1) selects a branch until a terminal state is reached, after which decoding restarts from the initial state. Here o denotes a terminal state.]
  • Equality $\bar{L} = H(S)$ holds under one condition: $p_k = 2^{-l_k}$ for every symbol $s_k$.

  13. Huffman Coding
  • Property: it is a prefix code whose average code-word length approaches the fundamental limit H(S); in this sense it is optimum.
  • Algorithm shown by example.

  14. Huffman Coding (cont')
  • The result for the example source is an average code-word length of $\bar{L} = 2.2$ bits/symbol, while H(S) = 2.12193 bits/symbol.
  • Huffman encoding is not unique:
  1) interchanging 0 and 1 at any branching point gives another valid code (trivial), or
  2) when probabilities tie, the combined symbol can be placed in different positions (next slide).
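
  A compact Huffman sketch in Python; the distribution {0.4, 0.2, 0.2, 0.1, 0.1} is an assumed example (it reproduces H(S) = 2.12193 bits), and huffman_lengths is an illustrative helper rather than the slides' own construction:

    import heapq
    from math import log2

    def huffman_lengths(probs):
        # Return Huffman code-word lengths: each merge of the two least probable
        # subtrees adds one bit to every symbol contained in them.
        heap = [(p, k, [k]) for k, p in enumerate(probs)]   # (probability, tie-breaker, symbols)
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tie = len(probs)
        while len(heap) > 1:
            p1, _, s1 = heapq.heappop(heap)
            p2, _, s2 = heapq.heappop(heap)
            for k in s1 + s2:
                lengths[k] += 1
            heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
            tie += 1
        return lengths

    probs = [0.4, 0.2, 0.2, 0.1, 0.1]            # assumed example distribution
    lengths = huffman_lengths(probs)
    L_bar = sum(p * l for p, l in zip(probs, lengths))
    H = sum(p * log2(1.0 / p) for p in probs)
    print(lengths, L_bar, H)                     # e.g. [2, 2, 2, 3, 3]  2.2  2.12193...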

  15. Practical Source Coding (cont')
  • When the combined symbol ties with another symbol of equal probability, it can be placed a) as high as possible or b) as low as possible in the list.
  • Both choices give the same average code-word length, but with different variance of the code-word lengths.

  16. Discrete Memoryless Channel
  • Definition of a DMC: a channel with input X and output Y, where Y is a noisy version of X.
  Discrete: both alphabets X and Y have finite size.
  Memoryless: the current output symbol depends only on the current input symbol, not on any previous ones.

  17. Discrete Memoryless Channel (cont')
  • Channel matrix (transition probability matrix): $\mathbf{P} = [\,p(y_k \mid x_j)\,]$, of size J by K, where $p(y_k \mid x_j) = P(Y = y_k \mid X = x_j)$.
  • Each row sums to one: $\sum_{k=0}^{K-1} p(y_k \mid x_j) = 1$ for all j.
  • The a priori probabilities of the input symbols are $p(x_j) = P(X = x_j)$, $j = 0, 1, \ldots, J-1$.

  18. Discrete Memoryless Channel (cont')
  • Given the a priori probabilities $p(x_j)$ and the channel matrix P, we can find the probabilities of the various output symbols:
  the joint probability distribution of X and Y, $p(x_j, y_k) = p(y_k \mid x_j)\, p(x_j)$;
  the marginal probability distribution of the output Y, $p(y_k) = \sum_{j=0}^{J-1} p(y_k \mid x_j)\, p(x_j)$.

  19. Discrete Memoryless Channel (cont')
  • BSC (Binary Symmetric Channel): each input bit is received correctly with probability 1-p and flipped with probability p, giving the channel matrix
  $\mathbf{P} = \begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}$
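
  A sketch applying the output-distribution formula of slide 18 to this BSC; the crossover probability p = 0.1 and the input distributions are assumed values:

    def output_distribution(p_x, P):
        # p(y_k) = sum_j p(y_k | x_j) * p(x_j), with P[j][k] = p(y_k | x_j)
        return [sum(p_x[j] * P[j][k] for j in range(len(p_x))) for k in range(len(P[0]))]

    p = 0.1                                    # assumed crossover probability
    P = [[1 - p, p],
         [p, 1 - p]]                           # BSC channel matrix
    print(output_distribution([0.5, 0.5], P))  # [0.5, 0.5]
    print(output_distribution([0.8, 0.2], P))  # [0.74, 0.26]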

  20. Mutual Information
  • Conditional entropy: $H(X \mid Y = y_k) = \sum_{j=0}^{J-1} p(x_j \mid y_k) \log_2 \frac{1}{p(x_j \mid y_k)}$
  • Its mean value over Y -> $H(X \mid Y) = \sum_{k=0}^{K-1} p(y_k)\, H(X \mid Y = y_k) = \sum_{j,k} p(x_j, y_k) \log_2 \frac{1}{p(x_j \mid y_k)}$ : the conditional entropy (equivocation), i.e. the amount of uncertainty remaining about the channel input after the channel output has been observed.
  • Mutual information: the uncertainty of the input resolved by observing the output, $I(X;Y) \equiv H(X) - H(X \mid Y)$, and likewise $I(Y;X) = H(Y) - H(Y \mid X)$.
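
  A numeric sketch of I(X;Y) = H(X) - H(X|Y); the function and the BSC example with p = 0.1 and equiprobable inputs are illustrative assumptions:

    from math import log2

    def mutual_information(p_x, P):
        # I(X;Y) = H(X) - H(X|Y), with P[j][k] = p(y_k | x_j)
        J, K = len(P), len(P[0])
        p_xy = [[p_x[j] * P[j][k] for k in range(K)] for j in range(J)]   # joint p(x_j, y_k)
        p_y = [sum(p_xy[j][k] for j in range(J)) for k in range(K)]       # marginal p(y_k)
        h_x = sum(-p * log2(p) for p in p_x if p > 0)
        # H(X|Y) = sum_{j,k} p(x_j, y_k) * log2( p(y_k) / p(x_j, y_k) )
        h_x_given_y = sum(p_xy[j][k] * log2(p_y[k] / p_xy[j][k])
                          for j in range(J) for k in range(K) if p_xy[j][k] > 0)
        return h_x - h_x_given_y

    # BSC with p = 0.1 and equiprobable inputs: I(X;Y) = 1 - H(0.1) ~ 0.531 bits per channel use
    print(mutual_information([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))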

  21. Properties of Mutual Information (a simple example is needed for a 2-by-2 DMC)
  [Figure: Venn-style diagram of H(X) and H(Y) overlapping in I(X;Y), with H(X|Y) and H(Y|X) as the non-overlapping parts.]
  • Symmetric: $I(X;Y) = I(Y;X)$
  • Non-negative: $I(X;Y) \ge 0$
  • $I(X;Y) = H(X) + H(Y) - H(X,Y)$, where $H(X,Y) = \sum_{j,k} p(x_j, y_k) \log_2 \frac{1}{p(x_j, y_k)}$ is the joint entropy.

  22. Channel Capacity
  • For a DMC with input X, output Y, and transition probabilities $p(y_k \mid x_j)$, the mutual information is $I(X;Y) = \sum_{j,k} p(x_j, y_k) \log_2 \frac{p(y_k \mid x_j)}{p(y_k)}$.
  • I(X;Y) depends only on the input distribution $\{p(x_j)\}$ and the channel. Since $\{p(x_j)\}$ is independent of the channel, it is possible to maximize I(X;Y) with respect to $\{p(x_j)\}$.
  • Def. of channel capacity: $C = \max_{\{p(x_j)\}} I(X;Y)$ (bits per channel use).

  23. Ex.) for BSC
  • With transition probability p, the capacity is achieved by equiprobable inputs and equals $C = 1 - H(p) = 1 + p \log_2 p + (1-p) \log_2 (1-p)$ bits per channel use; C = 1 for p = 0 or p = 1, and C = 0 for p = 1/2.
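
  A small sketch of C = 1 - H(p) for the BSC; the function name and the sweep values are assumptions:

    from math import log2

    def bsc_capacity(p):
        # C = 1 - H(p) = 1 + p*log2(p) + (1-p)*log2(1-p), in bits per channel use
        if p in (0.0, 1.0):
            return 1.0
        return 1.0 + p * log2(p) + (1.0 - p) * log2(1.0 - p)

    for p in (0.0, 0.1, 0.5):
        print(p, bsc_capacity(p))   # 1.0, ~0.531, 0.0 -> the channel is useless at p = 1/2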

  24. Channel Coding Theorem
  • For reliable communication, channel encoding and decoding are needed. "Is there any coding scheme which makes the error probability as small as desired, and which is efficient enough that the code rate is not too small?" => Shannon's second theorem (noisy coding theorem)
  • Let a DMS with alphabet X have entropy H(X) and produce symbols once every $T_s$ seconds, and let a DMC have capacity C and be used once every $T_c$ seconds. Then,
  i) if $H(X)/T_s \le C/T_c$, there exists a coding scheme achieving an arbitrarily small probability of error;
  ii) if $H(X)/T_s > C/T_c$, it is not possible to transmit with arbitrarily small error.

  25. Ex.) for BSC
  • The condition for reliable communication is $H(X)/T_s \le C/T_c$; with H(X) = 1 bit per (equiprobable binary) source symbol, this becomes $T_c/T_s \le C$.
  • Let $T_c/T_s$ be r; then for $r \le C$ there exists a code (with code rate less than or equal to C) capable of achieving an arbitrarily low probability of error.
  • "The code rate is $r = k/n$, where k is the number of input bits and n is the number of coded bits."

  26. Differential Entropy
  • Differential entropy: $h(X) = -\int_{-\infty}^{\infty} f_X(x) \log_2 f_X(x)\, dx$, where $f_X(x)$ is the p.d.f. of X; the extension of entropy to continuous random variables.
  • It is the basis for deriving the channel capacity theorem.
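
  A numeric sketch of the definition, integrating the differential entropy of a Gaussian p.d.f. and comparing it with the closed form quoted on the next slides; all names and numeric values are assumptions:

    from math import log2, pi, e, exp, sqrt

    def gaussian_pdf(x, sigma):
        return exp(-x * x / (2.0 * sigma * sigma)) / (sigma * sqrt(2.0 * pi))

    def differential_entropy(pdf, lo, hi, n=200000):
        # h(X) = -integral f(x) * log2 f(x) dx, approximated by a midpoint Riemann sum
        dx = (hi - lo) / n
        total = 0.0
        for i in range(n):
            f = pdf(lo + (i + 0.5) * dx)
            if f > 0.0:
                total -= f * log2(f) * dx
        return total

    sigma = 2.0
    print(differential_entropy(lambda x: gaussian_pdf(x, sigma), -40.0, 40.0))  # ~3.047
    print(0.5 * log2(2.0 * pi * e * sigma ** 2))                                # closed form, ~3.047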

  27. Maximum Differential Entropy for Specified Variance
  • Problem: among all p.d.f.s with a given (finite) variance $\sigma^2$, find the one that maximizes the differential entropy h(X).

  28. Maximum Differential Entropy for Specified Variance
  • The maximizing p.d.f. is the Gaussian, and the corresponding maximum differential entropy is $h(X) = \frac{1}{2}\log_2(2\pi e \sigma^2)$ bits.

  29. Mutual Information for Continuous r.v.
  • $I(X;Y) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x,y) \log_2 \frac{f_{X,Y}(x,y)}{f_X(x)\, f_Y(y)}\, dx\, dy = h(X) - h(X \mid Y) = h(Y) - h(Y \mid X)$

  30. Shannon's Channel Capacity Theorem
  • For a band-limited AWGN channel of bandwidth B (Hz), average signal power P, and noise power spectral density $N_0/2$, the capacity is $C = B \log_2\left(1 + \frac{P}{N_0 B}\right)$ bits per second.
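
  A small sketch of the capacity formula; the bandwidth and SNR values are assumed for illustration:

    from math import log2

    def awgn_capacity(bandwidth_hz, snr_linear):
        # C = B * log2(1 + SNR) bits per second, where SNR = P / (N0 * B)
        return bandwidth_hz * log2(1.0 + snr_linear)

    snr = 10.0 ** (15.0 / 10.0)        # assumed SNR of 15 dB (~31.6 in linear units)
    print(awgn_capacity(1.0e6, snr))   # ~5.03e6 bit/s for an assumed 1 MHz bandwidth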

  31. Bandwidth Efficiency Diagram
  [Figure: spectral efficiency R/B versus Eb/N0, with the capacity boundary R = C separating the region where error-free transmission is possible from the region where it is not.]
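
  A sketch of the capacity boundary that such a diagram plots: for R = C, $E_b/N_0 = (2^{R/B} - 1)/(R/B)$, which tends to $\ln 2$ (about -1.59 dB, the Shannon limit) as R/B -> 0; the function name and sample values are assumptions:

    from math import log10

    def min_ebn0(spectral_efficiency):
        # Minimum Eb/N0 (linear) on the capacity boundary R = C: (2^(R/B) - 1) / (R/B)
        rho = spectral_efficiency
        return (2.0 ** rho - 1.0) / rho

    for rho in (0.001, 1.0, 2.0, 6.0):
        print(rho, 10.0 * log10(min_ebn0(rho)))   # -> -1.59 dB (Shannon limit), 0 dB, 1.76 dB, 10.2 dB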
