Entropy of Hidden Markov Processes

Or Zuk¹, Ido Kanter², Eytan Domany¹

¹Weizmann Inst.  ²Bar-Ilan Univ.

Overview
  • Introduction
  • Problem Definition
  • Statistical Mechanics approach
  • Cover&Thomas Upper-Bounds
  • Radius of Convergence
  • Related subjects
  • Future Directions
HMP - Definitions

Markov process:
  • X – a Markov process
  • M – transition matrix
  • M_ij = Pr(X_{n+1} = j | X_n = i)

Hidden Markov process:
  • Y – a noisy observation of X
  • N – noise/emission matrix
  • N_ij = Pr(Y_n = j | X_n = i)

[Diagram: the hidden chain X_n → X_{n+1} evolves via M; each X_n emits an observation Y_n via N]
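As a concrete illustration of these definitions, here is a minimal Python sketch (ours, not from the slides; the function name and the uniform initial state are our choices):

```python
import numpy as np

def sample_hmp(M, N, n, rng=None):
    """Sample n steps of a hidden Markov process.

    M[i, j] = Pr(X_{t+1} = j | X_t = i)  (transition matrix)
    N[i, j] = Pr(Y_t = j | X_t = i)      (emission matrix)
    Returns the hidden states X and the observations Y.
    """
    rng = rng or np.random.default_rng()
    n_states, n_symbols = N.shape
    X = np.empty(n, dtype=int)
    Y = np.empty(n, dtype=int)
    X[0] = rng.integers(n_states)                   # arbitrary (uniform) initial state
    Y[0] = rng.choice(n_symbols, p=N[X[0]])
    for t in range(1, n):
        X[t] = rng.choice(n_states, p=M[X[t - 1]])  # step the hidden chain
        Y[t] = rng.choice(n_symbols, p=N[X[t]])     # emit a noisy observation
    return X, Y
```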
Example: Binary HMP

[Diagram: a two-state transition graph on states 0 and 1 with probabilities p(0|0), p(1|0), p(0|1), p(1|1), and a two-state emission graph with probabilities q(0|0), q(1|0), q(0|1), q(1|1)]
Example: Binary HMP (Cont.)
  • For simplicity, we concentrate on the symmetric binary HMP:
  • M = [1-p, p; p, 1-p],  N = [1-ε, ε; ε, 1-ε]
  • So all properties of the process depend on two parameters, p and ε. Assume (w.l.o.g.) p, ε < ½.
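In code, the whole model is then determined by p and ε; a tiny companion to the sampler above (again our sketch):

```python
import numpy as np

def symmetric_binary_hmp(p, eps):
    """Symmetric binary HMP: flip the state w.p. p, flip the observation w.p. eps."""
    M = np.array([[1 - p, p], [p, 1 - p]])          # transition matrix
    N = np.array([[1 - eps, eps], [eps, 1 - eps]])  # emission matrix
    return M, N

# Usage with the sampler above:
# X, Y = sample_hmp(*symmetric_binary_hmp(p=0.2, eps=0.05), n=1000)
```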
HMP Entropy Rate
  • Definition:

H = lim_{n→∞} (1/n) H(Y_1, ..., Y_n)

H is difficult to compute: it is given as a Lyapunov exponent, which is hard to compute in general [Jacquet et al. 04].

  • What to do? Calculate H in different regimes.
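Although no closed form is known, H is easy to estimate numerically. One standard route (our illustration, not the talk's method) uses the Shannon-McMillan-Breiman theorem, -(1/n) log Pr(Y_1, ..., Y_n) → H, with the sequence probability computed by the scaled forward algorithm:

```python
import numpy as np

def entropy_rate_estimate(M, N, Y):
    """Estimate H (in nats) as -(1/n) log Pr(Y_1..Y_n), via the scaled forward algorithm."""
    n_states = M.shape[0]
    alpha = np.full(n_states, 1.0 / n_states) * N[:, Y[0]]  # uniform initial distribution
    log_prob = np.log(alpha.sum())
    alpha /= alpha.sum()
    for y in Y[1:]:
        alpha = (alpha @ M) * N[:, y]    # one forward step
        log_prob += np.log(alpha.sum())  # accumulate log Pr(y_t | y_1..y_{t-1})
        alpha /= alpha.sum()             # rescale to avoid underflow
    return -log_prob / len(Y)

# M, N = symmetric_binary_hmp(p=0.2, eps=0.05)
# _, Y = sample_hmp(M, N, n=1_000_000)
# print(entropy_rate_estimate(M, N, Y))  # converges to H as n grows
```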
Different Regimes

p -> 0,  p -> ½  (ε fixed)

ε -> 0,  ε -> ½  (p fixed)

[Ordentlich&Weissman 04] study several regimes.

We concentrate on the 'small noise regime' ε -> 0.

The solution can be given as a power series in ε:

H(p, ε) = Σ_{k≥0} H_k(p) ε^k
Statistical Mechanics

First, observe the Markov property:

Pr(X_1, ..., X_n) = Pr(X_1) Π_{t=1}^{n-1} M_{X_t X_{t+1}}

Then perform a change of variables to spins:

σ_t = 2X_t - 1,  τ_t = 2Y_t - 1,  so σ_t, τ_t ∈ {-1, 1}
Statistical Mechanics (cont.)

Ising model: σ, τ ∈ {-1, 1} (spin glasses)

[Diagram: a sample ±1 spin configuration, and two coupled chains σ_1, ..., σ_n and τ_1, ..., τ_n, with coupling J along the hidden (σ) chain and coupling K between each σ_i and its observation τ_i]
Statistical Mechanics (cont.)

Computing the entropy via a low-temperature / high-field expansion:
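The object being expanded is the quenched average of the log-likelihood, which plays the role of a log-partition function with the observations acting as quenched disorder. In the notation above (our reconstruction of the standard mapping, in nats):

H = -lim_{n→∞} (1/n) E[ ln Pr(Y_1, ..., Y_n) ],  where  Pr(Y_1, ..., Y_n) = Σ_{X_1,...,X_n} Pr(X_1) Π_{t} M_{X_t X_{t+1}} Π_{t} N_{X_t Y_t}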

Cover&Thomas Bounds

It is known (Cover & Thomas 1991) that the entropy rate is sandwiched by conditional entropies:

H(Y_n | Y_{n-1}, ..., Y_1, X_1)  ≤  H  ≤  H(Y_n | Y_{n-1}, ..., Y_1)

  • We will use the upper bounds C(n) = H(Y_n | Y_{n-1}, ..., Y_1) and derive their orders in ε:
  • Question: Do the orders 'saturate'?
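For small n, C(n) can be computed exactly by enumerating all observation strings of length n. A brute-force sketch (ours; it assumes the uniform initial distribution, which is stationary for the symmetric chain, and n ≥ 2):

```python
import itertools
import numpy as np

def upper_bound_C(M, N, n):
    """C(n) = H(Y_n | Y_{n-1}, ..., Y_1) in nats, by brute-force enumeration (n >= 2).

    Uses the chain rule: C(n) = H(Y_1..Y_n) - H(Y_1..Y_{n-1}).
    """
    def block_entropy(m):
        ent = 0.0
        for ys in itertools.product(range(N.shape[1]), repeat=m):
            # Forward algorithm for Pr(y_1..y_m), uniform (stationary) start.
            alpha = np.full(M.shape[0], 1.0 / M.shape[0]) * N[:, ys[0]]
            for y in ys[1:]:
                alpha = (alpha @ M) * N[:, y]
            p = alpha.sum()
            if p > 0:
                ent -= p * np.log(p)
        return ent

    return block_entropy(n) - block_entropy(n - 1)
```

C(n) is non-increasing in n and converges to H from above, so successive values show how quickly the bound tightens.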
Cover&Thomas Bounds (cont.)
  • Answer: Yes. In fact they 'saturate' sooner than would have been expected! For n ≥ (k+3)/2 they become constant.

We therefore have: for n ≥ (k+3)/2, the coefficient of ε^k in C(n) no longer depends on n.

  • Conjecture 1: this settled coefficient equals H_k (proven for k = 1).
  • How do the orders look? Their expression is simpler when written in terms of λ = 1 - 2p, the second eigenvalue of the transition matrix M.
  • Conjecture 2: an explicit formula for the orders H_k in terms of λ.
First Few Orders
  • At ε = 0 the observations equal the hidden chain, so the zeroth order is the entropy rate of the underlying Markov chain: H_0 = -p ln p - (1-p) ln(1-p).
  • Note: H_0 through H_2 are proven. The higher orders are conjectures derived from the upper bounds.
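A quick numeric sanity check of the zeroth order, reusing the sketches above (our addition): for tiny ε the upper bound C(n) should be close to the binary entropy of p.

```python
import numpy as np

p, eps, n = 0.2, 1e-4, 10
M, N = symmetric_binary_hmp(p, eps)
h0 = -p * np.log(p) - (1 - p) * np.log(1 - p)  # H_0, the eps = 0 entropy rate
print(upper_bound_C(M, N, n), h0)              # nearly equal for tiny eps
```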
Radius of Convergence

When is our power-series approximation good?

Instructive: compare to the i.i.d. model, where the expansion is known.

For the HMP the limiting radius is unknown; we used a numerical fit.
Relative Entropy Rate
  • Relative entropy rate:

D = lim_{n→∞} (1/n) D( Pr(Y_1, ..., Y_n) || Pr(Y'_1, ..., Y'_n) )

  • We get:
Index of Coincidence
  • Take two independent realizations Y, Y' (each of length n) of the same HMP. What is the probability that they are equal?
  • Pr(Y = Y') decays exponentially with n.
  • We get:
  • Similarly, we can solve for three and four (but not five) realizations. This can give bounds on the entropy rate.
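For two realizations this probability can be computed exactly with a transfer matrix on the pair chain (X_t, X'_t); a sketch (ours; it again assumes the uniform distribution is stationary, as it is for the symmetric chain):

```python
import numpy as np

def index_of_coincidence(M, N, n):
    """Pr(Y_1..Y_n = Y'_1..Y'_n) for two independent realizations of the HMP."""
    k = M.shape[0]
    E = N @ N.T             # E[i, j] = sum_y N[i, y] * N[j, y] = Pr(Y = Y' | X = i, X' = j)
    pair_M = np.kron(M, M)  # transition matrix of the pair chain (X, X')
    v = (np.full((k, k), 1.0 / k**2) * E).ravel()  # uniform start, first symbols agree
    for _ in range(n - 1):
        v = (v @ pair_M) * E.ravel()               # step both chains, require agreement
    return v.sum()
```

Its exponential decay rate, -(1/n) log Pr(Y = Y'), converges to the order-2 Rényi entropy rate of Y, which lower-bounds the Shannon entropy rate; this matches the slide's remark that these quantities bound H.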
Future Directions
  • Proving conjectures
  • Generalizations (e.g. larger alphabets, the continuous case)
  • Other regimes
  • Relative Entropy of two HMPs

Thank You