
Entropy of Hidden Markov Processes



  1. Entropy of Hidden Markov Processes • Or Zuk [1], Ido Kanter [2], Eytan Domany [1] • [1] Weizmann Inst., [2] Bar-Ilan Univ.

  2. Overview • Introduction • Problem Definition • Statistical Mechanics Approach • Cover & Thomas Upper Bounds • Radius of Convergence • Related Subjects • Future Directions

  3. HMP - Definitions • Markov Process: X – Markov process, M – transition matrix, Mij = Pr(Xn+1 = j | Xn = i) • Hidden Markov Process: Y – noisy observation of X, N – noise/emission matrix, Nij = Pr(Yn = j | Xn = i) • [Diagram: chain Xn → Xn+1 governed by M, with each Xn emitting Yn through N]

  4. Example: Binary HMP • [Diagram: two-state chain on {0, 1} with transition probabilities p(j|i) and emission probabilities q(j|i)] • Transition: p(0|0), p(1|0), p(0|1), p(1|1) • Emission: q(0|0), q(1|0), q(0|1), q(1|1)

  5. Example: Binary HMP (Cont.) • For simplicity, we will concentrate on the Symmetric Binary HMP: M = [[1-p, p], [p, 1-p]], N = [[1-ε, ε], [ε, 1-ε]] • So all properties of the process depend on two parameters, p and ε. Assume (w.l.o.g.) p, ε < ½
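A minimal simulation sketch of the symmetric binary HMP just defined (my own illustration, not part of the original slides; the function name and the NumPy dependency are assumptions):

```python
import numpy as np

def sample_binary_hmp(p, eps, n, seed=None):
    """Sample n steps of the symmetric binary HMP with flip probability p
    (hidden chain) and noise probability eps (emission)."""
    rng = np.random.default_rng(seed)
    M = np.array([[1 - p, p], [p, 1 - p]])          # transition matrix
    N = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission matrix
    X = np.empty(n, dtype=int)
    Y = np.empty(n, dtype=int)
    X[0] = rng.integers(2)           # stationary distribution is uniform
    Y[0] = rng.choice(2, p=N[X[0]])
    for t in range(1, n):
        X[t] = rng.choice(2, p=M[X[t - 1]])   # hidden Markov step
        Y[t] = rng.choice(2, p=N[X[t]])       # noisy observation
    return X, Y
```

For example, sample_binary_hmp(0.1, 0.05, 10) returns one hidden path together with its noisy observation sequence.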

  6. HMP Entropy Rate • Definition: H = lim n→∞ H(Yn | Yn-1, …, Y1) = lim n→∞ (1/n) H(Y1, …, Yn) • H is difficult to compute; it is given as a Lyapunov exponent (which is hard to compute in general) [Jacquet et al 04] • What to do? Calculate H in different regimes.
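As a rough numerical reference (a sketch for the symmetric parameterization above, not a method from the slides), H can be estimated by running the forward filtering recursion along one long realization and averaging the predictive log-loss, which converges to the entropy rate by the Shannon-McMillan-Breiman theorem:

```python
import numpy as np

def estimate_entropy_rate(p, eps, n=200_000, seed=None):
    """Monte Carlo estimate of the entropy rate H (bits per symbol) of the
    symmetric binary HMP, via the forward (filtering) recursion."""
    rng = np.random.default_rng(seed)
    M = np.array([[1 - p, p], [p, 1 - p]])
    N = np.array([[1 - eps, eps], [eps, 1 - eps]])
    x = rng.integers(2)               # hidden state, stationary start
    belief = np.array([0.5, 0.5])     # Pr(X_t | Y_1 .. Y_{t-1})
    total = 0.0
    for _ in range(n):
        y = rng.choice(2, p=N[x])     # emit current observation
        pred = belief @ N             # predictive dist. of Y_t given past
        total -= np.log2(pred[y])
        belief = belief * N[:, y]     # condition on Y_t = y ...
        belief /= belief.sum()
        belief = belief @ M           # ... and propagate to time t+1
        x = rng.choice(2, p=M[x])     # advance the hidden chain
    return total / n
```

This is essentially the random-matrix-product (Lyapunov exponent) computation in disguise: the belief vector is what the products of noise-weighted transition matrices act on.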

  7. Different Regimes • p → 0, p → ½ (ε fixed) • ε → 0, ε → ½ (p fixed) • [Ordentlich & Weissman 04] study several regimes. We concentrate on the 'small noise regime' ε → 0. The solution can be given as a power series in ε: H(ε) = H0 + H1 ε + H2 ε² + …

  8. Statistical Mechanics • First, observe the Markovian property: the joint probability factorizes as Pr(X1, …, Xn, Y1, …, Yn) = Pr(X1) ∏ M(Xi, Xi+1) ∏ N(Xi, Yi) • Perform a change of variables: map the binary values {0, 1} to spin variables σi, τi ∈ {−1, +1}

  9. Statistical Mechanics (cont.) • Ising Model: hidden spins σi and observed spins τi, with σi, τi ∈ {−1, 1} • Spin-glass picture: a chain σ1 … σn with nearest-neighbour couplings J, and each σi coupled to its observation τi with coupling K • [Diagram: two coupled spin chains σ1 … σn and τ1 … τn, with J bonds along the hidden chain and K bonds between the chains]
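A hedged reconstruction of the mapping (the standard identification for symmetric binary transition and emission probabilities; the exact form and normalization on the original slide are not reproduced here):

```latex
% Joint distribution of hidden spins \sigma and observed spins \tau as a
% one-dimensional Ising chain coupled to its observations:
\Pr(\sigma,\tau) \;=\; \frac{1}{Z}\,
  \exp\Big( J\sum_{i=1}^{n-1}\sigma_i\sigma_{i+1}
          + K\sum_{i=1}^{n}\sigma_i\tau_i \Big),
\qquad
e^{2J} = \frac{1-p}{p}, \quad
e^{2K} = \frac{1-\varepsilon}{\varepsilon}, \quad
Z = 2\,(2\cosh J)^{\,n-1}(2\cosh K)^{\,n}.
```

With this mapping, computing H becomes a free-energy-like calculation, which is where the low-temperature/high-field expansion of the next slides comes in.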

  10. Statistical Mechanics (cont.) Summing, we get :

  11. Statistical Mechanics (cont.) Computing the Entropy (low-temperature/high-field expansion) :

  12. Cover & Thomas Bounds • It is known (Cover & Thomas 1991) that H(Yn | Yn-1, …, Y1, X1) ≤ H ≤ H(Yn | Yn-1, …, Y1), and both bounds converge to H as n → ∞ • We will use the upper bounds C(n) = H(Yn | Yn-1, …, Y1), and derive their orders (in ε) • Question: do the orders 'saturate'?
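A brute-force sketch of both bounds for small n (my own illustration, not the authors' computation), obtained by enumerating all length-n binary observation sequences with the forward recursion:

```python
import numpy as np
from itertools import product

def seq_entropy(p, eps, n, with_x1=False):
    """H(Y_1..Y_n) or, if with_x1, H(X_1, Y_1..Y_n), in bits,
    by enumerating all 2^n binary observation sequences."""
    M = np.array([[1 - p, p], [p, 1 - p]])
    N = np.array([[1 - eps, eps], [eps, 1 - eps]])
    H = 0.0
    for y in product(range(2), repeat=n):
        per_x1 = []
        for x1 in range(2):
            alpha = np.zeros(2)
            alpha[x1] = 0.5 * N[x1, y[0]]          # Pr(X_1) = 1/2 (stationary)
            for t in range(1, n):
                alpha = (alpha @ M) * N[:, y[t]]   # forward recursion
            per_x1.append(alpha.sum())             # Pr(X_1 = x1, Y = y)
        terms = per_x1 if with_x1 else [sum(per_x1)]
        H -= sum(q * np.log2(q) for q in terms if q > 0)
    return H

def cover_thomas_bounds(p, eps, n):
    """(lower, upper) Cover & Thomas bounds on the entropy rate, n >= 2:
    H(Y_n|Y_1..Y_{n-1},X_1) <= H <= H(Y_n|Y_1..Y_{n-1})."""
    upper = seq_entropy(p, eps, n) - seq_entropy(p, eps, n - 1)
    lower = (seq_entropy(p, eps, n, with_x1=True)
             - seq_entropy(p, eps, n - 1, with_x1=True))
    return lower, upper
```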

  13. Cover & Thomas Bounds (cont.) [Plot: the bounds for n = 4]

  14. Cover & Thomas Bounds (cont.) • Ans: Yes. In fact they 'saturate' sooner than would have been expected! For n ≥ (K+3)/2 they become constant. We therefore have: • Conjecture 1: for n ≥ (K+3)/2, the order-K coefficient (in ε) of C(n) equals HK, the order-K coefficient of the entropy rate (proven for K = 1) • How do the orders look? Their expression is simpler when written in terms of λ = 1 − 2p, which is the 2nd eigenvalue of the transition matrix M. • Conjecture 2:
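A rough numerical check of this saturation (my own sketch, reusing cover_thomas_bounds() from above): estimate the leading Taylor coefficients of C(n) in ε by a polynomial fit at small noise values. Extracting high orders this way is numerically delicate, so it is only indicative:

```python
import numpy as np

def upper_bound_orders(p, n, k_max=3, eps0=1e-3):
    """Estimate the coefficients c_0..c_{k_max} of
    C(n)(eps) ~= sum_k c_k eps^k by fitting a polynomial to exact values
    of the upper bound at several small eps (uses cover_thomas_bounds)."""
    eps_grid = eps0 * np.arange(1, 2 * k_max + 2)
    values = [cover_thomas_bounds(p, e, n)[1] for e in eps_grid]
    coeffs = np.polyfit(eps_grid, values, k_max)   # highest power first
    return coeffs[::-1]                            # c_0, c_1, ..., c_{k_max}
```

Comparing upper_bound_orders(p, n) for successive n is one way to see numerically that the low orders stop changing once n ≥ (K+3)/2.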

  15. First Few Orders: • Note: H0-H2 are proven. The rest are conjectures derived from the upper bounds.
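As a point of reference (not taken from the slides): the zeroth order follows directly from the definitions, since at ε = 0 the observations coincide with the Markov chain, so H0 is the chain's entropy rate:

```latex
H_0 \;=\; -\,p \log_2 p \;-\; (1-p)\log_2 (1-p)
```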

  16. First Few Orders (Cont.) :

  17. First Few Orders (Cont.) :

  18. Radius of Convergence: • When is our approximation good? • Instructive: compare to the I.I.D. model • For the HMP, the limit is unknown. We used the fit:
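For contrast, here is what the i.i.d. comparison presumably refers to (an assumption on my part): if the hidden process is i.i.d. Bernoulli(p) and observed through the same binary symmetric channel, the output is i.i.d. with parameter p + ε(1 − 2p), so the entropy is an explicit analytic function of ε whose radius of convergence can be read off from the singularities of the binary entropy function:

```latex
% Binary entropy function h(z) = -z\log_2 z - (1-z)\log_2(1-z).
H_{\mathrm{iid}}(\varepsilon) \;=\; h\big(p + \varepsilon\,(1-2p)\big),
\qquad
\text{singular where } p + \varepsilon(1-2p) \in \{0,1\}
\;\Rightarrow\;
\text{radius of convergence } \rho = \frac{p}{1-2p}
\quad (p < \tfrac12).
```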

  19. Radius of Convergence (cont.) :

  20. Radius of Convergence (cont.) :

  21. Relative Entropy Rate • Relative entropy rate : • We get :
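For context, a standard definition (added here, not reproduced from the slide's image): the relative entropy rate between two stationary processes with n-symbol distributions P and Q is

```latex
D(P \,\|\, Q) \;=\; \lim_{n\to\infty} \frac{1}{n}
  \sum_{y_1^n} P(y_1^n)\,\log_2 \frac{P(y_1^n)}{Q(y_1^n)} .
```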

  22. Index of Coincidence • Take two realizations Y, Y' (of length n) of the same HMP. What is the probability that they are equal? It decays exponentially with n. • We get: • Similarly, we can solve for three and four (but not five) realizations. This can give bounds on the entropy rate.
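A brute-force sketch (my own illustration) of the index of coincidence: Pr(Y = Y') = Σ_y Pr(y)², computed by enumeration. Its exponential decay rate is the order-2 Rényi entropy rate, which lower-bounds the Shannon entropy rate; this is presumably the kind of bound the slide alludes to:

```python
import numpy as np
from itertools import product

def collision_probability(p, eps, n):
    """Pr(Y = Y') for two independent length-n realizations of the
    symmetric binary HMP: sum over all sequences y of Pr(y)^2."""
    M = np.array([[1 - p, p], [p, 1 - p]])
    N = np.array([[1 - eps, eps], [eps, 1 - eps]])
    total = 0.0
    for y in product(range(2), repeat=n):
        alpha = np.array([0.5, 0.5]) * N[:, y[0]]   # forward recursion
        for t in range(1, n):
            alpha = (alpha @ M) * N[:, y[t]]
        total += alpha.sum() ** 2                   # Pr(y)^2
    return total

# -log2(collision_probability(p, eps, n)) / n estimates the order-2
# Renyi entropy rate, a lower bound on the Shannon entropy rate H.
```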

  23. Future Directions • Proving the conjectures • Generalizations (e.g. general alphabets, the continuous case) • Other regimes • Relative entropy of two HMPs • Thank You
