
# Probabilistic reasoning over time - PowerPoint PPT Presentation


## PowerPoint Slideshow about 'Probabilistic reasoning over time' - annick


Presentation Transcript

### Probabilistic reasoning over time

This sentence is likely to be untrue in the future!

• What do we know about the state of the world now, given a history of the world before?

• The only evidence we have is probabilistic.

• “Past performance may not be a guide to future performance.”

• States are our “events”.

• (Partial) states can be measured at reasonable time intervals.

• Xt: unobservable state variables at time t.

• Et ("evidence"): observable state variables at time t.

• Vm:n : Variables Vm, Vm+1,…,Vn

• Stationary: the laws of probability don't change over time.

• Markovian: the current unobservable state depends on only a finite number of past states.

• First-order: current state depends only on the previous state, i.e.:

• P(Xt|X0:t-1)=P(Xt|Xt-1)

• Second-order: current state depends on the previous two states, and so on for higher orders.
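The first-order assumption can be made concrete with a small simulation. The weather states and transition probabilities below are made-up illustration values, not from the slides; the point is that sampling the next state consults only the current one.

```python
import random

# Toy first-order Markov chain: P(Xt | X0:t-1) = P(Xt | Xt-1).
# Transition probabilities are illustrative assumptions.
TRANSITIONS = {
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"rain": 0.3, "sun": 0.7},
}

def step(state, rng=random):
    """Sample the next state given only the current state."""
    r = rng.random()
    cum = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a state sequence of length n+1 from a start state."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n):
        states.append(step(states[-1], rng))
    return states
```

A second-order chain would differ only in that `step` would take the last two states as its key.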

• Observable variables depend only on the current state (essentially by definition); these are the "sensors".

• The current state causes the sensor values.

• P(Et|X0:t,E0:t-1)=P(Et|Xt)

• What is P(X0)?

• At time t, the joint is completely determined:

• P(X0,X1,…,Xt,E1,…,Et) = P(X0) ∏i=1…t P(Xi|Xi-1)P(Ei|Xi)
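The factorization above can be evaluated directly for a toy "umbrella world". All of the numbers (prior, transition model, sensor model) are illustrative assumptions chosen for the example:

```python
# Joint P(X0, X1..Xt, E1..Et) = P(X0) * prod_i P(Xi|Xi-1) * P(Ei|Xi).
# All probabilities are made-up illustration values.
P_X0 = {"rain": 0.5, "sun": 0.5}
P_TRANS = {"rain": {"rain": 0.7, "sun": 0.3},
           "sun":  {"rain": 0.3, "sun": 0.7}}
P_SENSOR = {"rain": {"umbrella": 0.9, "no_umbrella": 0.1},
            "sun":  {"umbrella": 0.2, "no_umbrella": 0.8}}

def joint(states, evidence):
    """states = [x0, x1, ..., xt]; evidence = [e1, ..., et]."""
    p = P_X0[states[0]]
    for i, e in enumerate(evidence, start=1):
        p *= P_TRANS[states[i - 1]][states[i]] * P_SENSOR[states[i]][e]
    return p
```

For example, `joint(["rain", "rain"], ["umbrella"])` multiplies 0.5 × 0.7 × 0.9 = 0.315.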

• More state variables (temperature, humidity, pressure, season…)

• Higher order Markov processes (take more of the past into account).

• Belief/monitoring the current state

• Prediction about the next state

• Explanation of possible causes
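The first of these tasks, monitoring the current belief state, can be sketched as one filtering step: push the belief through the transition model (prediction), weight by the sensor model, and renormalize. The umbrella-world numbers are illustrative assumptions:

```python
# One filtering step for the belief P(Xt | e_1:t).
# Transition and sensor probabilities are made-up illustration values.
P_TRANS = {"rain": {"rain": 0.7, "sun": 0.3},
           "sun":  {"rain": 0.3, "sun": 0.7}}
P_SENSOR = {"rain": {"umbrella": 0.9, "no_umbrella": 0.1},
            "sun":  {"umbrella": 0.2, "no_umbrella": 0.8}}

def filter_step(belief, e):
    # Prediction: P(X_{t+1} | e_1:t) = sum_x P(X_{t+1} | x) P(x | e_1:t)
    predicted = {s: sum(P_TRANS[x][s] * belief[x] for x in belief)
                 for s in belief}
    # Update: weight by P(e | X_{t+1}), then normalize.
    updated = {s: P_SENSOR[s][e] * predicted[s] for s in predicted}
    z = sum(updated.values())
    return {s: p / z for s, p in updated.items()}
```

Starting from a uniform belief and observing an umbrella pushes the belief toward rain. Prediction reuses the same code without the update step; explanation (smoothing) additionally runs a backward pass over past states.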

• Further simplification:

• Only one state variable.

• We can now use matrices.

• Ti,j = P(Xt=j|Xt-1=i)
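With a single state variable, the transition model is just the matrix T above, and one prediction step is a vector–matrix product. A minimal sketch with illustrative two-state numbers:

```python
# T[i][j] = P(Xt = j | Xt-1 = i); rows sum to 1.
# The values are made-up illustration numbers.
T = [[0.7, 0.3],
     [0.3, 0.7]]

def predict(belief):
    """One prediction step: new_belief[j] = sum_i belief[i] * T[i][j]."""
    n = len(T)
    return [sum(belief[i] * T[i][j] for i in range(n)) for j in range(n)]
```

Starting from certainty in state 0, `predict([1.0, 0.0])` simply reads off row 0 of T.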

• P(words|signal) ∝ P(signal|words)P(words) (Bayes' rule; P(signal) is the same for every hypothesis)

• P(words) “language model”
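Decoding then means picking the word sequence that maximizes P(signal|words) × P(words); the constant P(signal) can be dropped. The candidate hypotheses and both model scores below are made-up toy numbers, not a real recognizer:

```python
# Toy Bayesian decoder: argmax over hypotheses of
# acoustic score P(signal|words) times language-model score P(words).
# Both tables are illustrative assumptions.
acoustic = {"recognize speech": 0.4, "wreck a nice beach": 0.7}
language = {"recognize speech": 0.01, "wreck a nice beach": 0.0001}

def decode(hypotheses):
    return max(hypotheses, key=lambda w: acoustic[w] * language[w])
```

Here the acoustically better hypothesis loses because the language model rates it as far less probable, which is exactly the role of P(words).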

• “Every time I fire a linguist, the recognition rate goes up.”

• Sample the speech signal

• Decide the most likely sequence of speech symbols

• Phonemes: minimal units of sound that make a meaning difference (beat vs. bit; fit vs. bit)

• Phones: normalized results of articulation (e.g., paid vs. tap).