資料壓縮 (Data Compression)

Presentation Transcript


  1. 資料壓縮 (Data Compression) Basic Concepts of Information Theory. Instructor: 陳建源. Email: cychen07@nuk.edu.tw. Office: 法401. Website: http://www.csie.nuk.edu.tw/~cychen/

  2. 1. Self-information. Let S be a system of events E1, E2, …, En in which P(Ek) = pk and p1 + p2 + … + pn = 1. Def: The self-information of the event Ek is written I(Ek): I(Ek) = -log P(Ek). The base of the logarithm: 2 (log), e (ln). Units: bit, nat.

  3. 1. Self-information. I(Ek) = 0 when P(Ek) = 1, and I(Ek) → ∞ when P(Ek) → 0: the smaller P(Ek) is, the larger I(Ek) becomes.

  4. 1. Self-information. Ex1. A letter is chosen at random from the English alphabet: each letter has probability 1/26, so I = log 26 ≈ 4.70 bits. Ex2. A binary number of m digits is chosen at random: each of the 2^m numbers has probability 2^(-m), so I = m bits.
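A minimal Python sketch of these two examples (the helper name self_information is mine, not from the slides):

```python
import math

def self_information(p: float, base: float = 2.0) -> float:
    """I(E) = -log P(E); base 2 gives bits, base e gives nats."""
    return -math.log(p, base)

# Ex1: a letter chosen uniformly at random from the 26-letter English alphabet
print(self_information(1 / 26))     # ≈ 4.70 bits

# Ex2: an m-digit binary number chosen uniformly at random (m = 8 here)
m = 8
print(self_information(2 ** -m))    # exactly m = 8.0 bits
```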

  5. 1. Self-information. Ex3. 64 points are arranged in a square grid. Let Ej be the event that a point picked at random lies in the jth column, and Ek be the event that a point picked at random lies in the kth row. Why?
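A worked version of Ex3, reconstructed by me (the slide's own equations did not survive the transcript); the answer to "Why?" is the independence of the row and the column:

```latex
% 8x8 grid, all 64 points equally likely
P(E_j) = \tfrac{8}{64} = \tfrac{1}{8}, \qquad I(E_j) = \log_2 8 = 3 \text{ bits}, \qquad I(E_k) = 3 \text{ bits}.
% The column and the row of a uniformly chosen point are independent, so
P(E_j \cap E_k) = \tfrac{1}{64}, \qquad I(E_j \cap E_k) = \log_2 64 = 6 \text{ bits} = I(E_j) + I(E_k).
```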

  6. 2. Entropy. Let S be the system with events E1, …, En, the associated probabilities being p1, …, pn. Consider a function f on the events, f: Ek → fk, and let E(f) be the expectation (average, mean) of f: E(f) = Σk pk fk.

  7. 2. Entropy. Def: The entropy of S, called H(S), is the average of the self-information: H(S) = Σk pk I(Ek) = -Σk pk log pk. The self-information of an event increases as its uncertainty grows. Observation: if some pk = 1 (certainty), then H(S) = 0. The minimum value is 0, meaning the outcome is already determined; but what is the maximum value?
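A small Python sketch of the entropy just defined (my own helper, not from the slides):

```python
import math

def entropy(probs, base: float = 2.0) -> float:
    """H(S) = -sum_k pk log pk, with the convention 0 * log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([1.0, 0.0, 0.0]))   # 0.0 -> certainty, the minimum
print(entropy([0.25] * 4))        # 2.0 -> the maximum for 4 events, log2(4)
```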

  8. 2. Entropy. Thm: H(S) ≤ log n, with equality only when pk = 1/n for every k. Proof:

  9. 2. Entropy. Thm 2.2: For x > 0, ln x ≤ x - 1, with equality only when x = 1. Assume that pk ≠ 0.

  10. 2. Entropy
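A LaTeX sketch of the standard argument behind slides 8–10 (my reconstruction; the slides' own equations are missing from the transcript):

```latex
H(S) - \log n
  = \sum_k p_k \log\frac{1}{p_k} - \log n \sum_k p_k
  = \sum_k p_k \log\frac{1}{n p_k}
  \le \frac{1}{\ln 2}\sum_k p_k\left(\frac{1}{n p_k} - 1\right)  % Thm 2.2: \ln x \le x - 1
  = \frac{1}{\ln 2}\left(\sum_k \frac{1}{n} - \sum_k p_k\right) = 0,
```

with equality only when 1/(n pk) = 1 for all k, i.e. pk = 1/n.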

  11. 2. Entropy. In the system S the probabilities p1 and p2, where p2 > p1, are replaced by p1 + ε and p2 - ε respectively, under the proviso 0 < 2ε < p2 - p1. Prove that H(S) is increased. We know that entropy H(S) can be viewed as a measure of _____ about S. Please list 3 items for this blank: information, uncertainty, randomness.
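A quick numerical check of the first exercise (a sketch of mine; the distribution and ε are arbitrary example values):

```python
import math

def entropy(probs, base=2.0):
    # H(S) = -sum pk log pk, as on slide 7
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Move mass eps from the larger p2 toward the smaller p1, with 0 < 2*eps < p2 - p1.
p1, p2, p3 = 0.1, 0.6, 0.3
eps = 0.1                                  # 0 < 0.2 < p2 - p1 = 0.5
before = entropy([p1, p2, p3])
after = entropy([p1 + eps, p2 - eps, p3])
print(before, after, after > before)       # entropy increases: prints ... True
```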

  12. 3. Mutual information. Let S1 be the system with events E1, …, En, the associated probabilities being p1, …, pn. Let S2 be the system with events F1, …, Fm, the associated probabilities being q1, …, qm.

  13. 3. Mutual information. Two systems S1 and S2 observed together satisfy the relation pjk = P(Ej ∩ Fk), with Σj Σk pjk = 1.

  14. 3. Mutual information. The relation between the joint and the marginal probabilities: Σk pjk = pj and Σj pjk = qk.

  15. 3. Mutual information. Conditional probability: P(Ej | Fk) = P(Ej ∩ Fk) / P(Fk). Conditional self-information: I(Ej | Fk) = -log P(Ej | Fk). Mutual information: I(Ej ; Fk) = log [ P(Ej | Fk) / P(Ej) ]. NOTE:

  16. 3. Mutual information. Conditional entropy: H(S1 | S2) = -Σj Σk pjk log P(Ej | Fk). Mutual information: I(S1 ; S2) = Σj Σk pjk log [ P(Ej | Fk) / P(Ej) ].

  17. 3. Mutual information. Mutual information and conditional self-information: I(Ej ; Fk) = I(Ej) - I(Ej | Fk). If Ej and Fk are statistically independent, then I(Ej ; Fk) = 0.

  18. 3. Mutual information. Joint entropy: H(S1, S2) = -Σj Σk pjk log pjk. Joint entropy and conditional entropy: H(S1, S2) = H(S1) + H(S2 | S1) = H(S2) + H(S1 | S2).

  19. 3. Mutual information. Mutual information and conditional entropy: I(S1 ; S2) = H(S1) - H(S1 | S2) = H(S2) - H(S2 | S1) = H(S1) + H(S2) - H(S1, S2).

  20. 3. Mutual information. Thm: The mutual information of two systems cannot exceed the sum of their separate entropies: I(S1 ; S2) ≤ H(S1) + H(S2).

  21. 3. Mutual information. Systems' independence: S1 and S2 are statistically independent if pjk = pj qk for all j, k. If S1 and S2 are statistically independent, then H(S1, S2) = H(S1) + H(S2): the joint entropy of two statistically independent systems is the sum of their separate entropies.
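A Python sketch (mine, not from the slides) that computes the quantities defined on slides 16–21 from an arbitrary example joint distribution:

```python
import math

def entropy(probs, base=2.0):
    # H = -sum p log p, as on slide 7
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Example joint probabilities pjk = P(Ej ∩ Fk); the numbers are arbitrary.
p = [[0.4, 0.1],
     [0.1, 0.4]]

pj = [sum(row) for row in p]                  # marginals of S1 (slide 14)
qk = [sum(col) for col in zip(*p)]            # marginals of S2 (slide 14)

H1 = entropy(pj)                              # H(S1)
H2 = entropy(qk)                              # H(S2)
H12 = entropy([x for row in p for x in row])  # joint entropy H(S1, S2) (slide 18)

H1_given_2 = H12 - H2                         # H(S1 | S2) = H(S1, S2) - H(S2) (slide 18)
I12 = H1 + H2 - H12                           # I(S1; S2) = H(S1) + H(S2) - H(S1, S2) (slide 19)

print(H1, H2, H12, H1_given_2, I12)
# If pjk = pj * qk (independence), I12 would be 0 and H12 = H1 + H2 (slide 21).
```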

  22. 3. Mutual information. Thm: I(S1 ; S2) ≥ 0, with equality only if S1 and S2 are statistically independent. Proof: Assume that pjk ≠ 0.

  23. 3. Mutual information. Thm: H(S1, S2) ≤ H(S1) + H(S2), with equality only if S1 and S2 are statistically independent. Proof:

  24. 3. Mutual information. Ex: A binary symmetric channel with crossover probability ε. [Channel diagram: 0 → 0 with probability 1 - ε, 0 → 1 with probability ε, 1 → 0 with probability ε, 1 → 1 with probability 1 - ε.] Let S1 be the input, E0 = 0, E1 = 1, and S2 be the output, F0 = 0, F1 = 1.

  25. 3. Mutual information. Assume that the input probabilities are P(E0) = p and P(E1) = 1 - p. Then the channel gives the conditional probabilities P(F0 | E0) = P(F1 | E1) = 1 - ε and P(F1 | E0) = P(F0 | E1) = ε.

  26. 3. Mutual information. Compute the output probabilities. Then P(F0) = p(1 - ε) + (1 - p)ε and P(F1) = pε + (1 - p)(1 - ε). If p = 1/2, then P(F0) = P(F1) = 1/2.

  27. 3. Mutual information Compute the mutual information

  28. 3. Mutual information Compute the mutual information
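A Python sketch of the computation on slides 24–28 (my own reconstruction; the chosen values of p and ε are only examples):

```python
import math

def entropy(probs, base=2.0):
    # H = -sum p log p, as on slide 7
    return -sum(x * math.log(x, base) for x in probs if x > 0)

def bsc_mutual_information(p: float, eps: float) -> float:
    """I(S1; S2) for a binary symmetric channel with input P(E0) = p and crossover probability eps."""
    # Joint probabilities pjk = P(Ej) * P(Fk | Ej)
    joint = [[p * (1 - eps), p * eps],
             [(1 - p) * eps, (1 - p) * (1 - eps)]]
    pj = [sum(row) for row in joint]            # input marginals
    qk = [sum(col) for col in zip(*joint)]      # output marginals
    H12 = entropy([x for row in joint for x in row])
    return entropy(pj) + entropy(qk) - H12      # I = H(S1) + H(S2) - H(S1, S2)

print(bsc_mutual_information(0.5, 0.1))   # ≈ 0.531 bits (= 1 minus the binary entropy of 0.1)
print(bsc_mutual_information(0.5, 0.5))   # 0.0 -> the output tells nothing about the input
```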

  29. 4. Differential entropy. The differential entropy of a density f(x) is defined by h(f) = -∫ f(x) log f(x) dx, whenever the integral exists. NOTE: • The entropy of a continuous distribution need not exist. • The entropy may be negative. Compare with the discrete case: the entropy of S, H(S), is the average of the self-information.

  30. 4. Differential entropy. Example: Consider a random variable distributed uniformly from 0 to a, so that its density is 1/a from 0 to a and 0 elsewhere. Then its differential entropy is h(f) = -∫₀ᵃ (1/a) log(1/a) dx = log a.
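A numerical illustration of this example (my sketch; the values of a are arbitrary and chosen to show that differential entropy can be negative):

```python
import math

def uniform_differential_entropy(a: float, base: float = 2.0) -> float:
    """h(f) = log a for the density that is 1/a on (0, a) and 0 elsewhere."""
    return math.log(a, base)

print(uniform_differential_entropy(2.0))   #  1.0 bit
print(uniform_differential_entropy(1.0))   #  0.0
print(uniform_differential_entropy(0.5))   # -1.0 -> differential entropy may be negative
```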
