
Lecture 2.2 Models for motion tracking



Presentation Transcript


  1. Lecture 2.2 Models for motion tracking CMSC 818W : Spring 2019 Tu-Th 2:00-3:15pm CSI 2118 Nirupam Roy Feb. 12th 2019

  2. Happy or sad?

  3. Happy or sad?

  4. Happy or sad?

  5. Happy or sad? Past experience P (The dolphin is happy | Experience)

  6. Inference from sensor data Tracking arm motion

  7. Inference from sensor data Tracking arm motion Partial information from sensors

  8. Inference from sensor data Tracking arm motion Knowledge about the arm’s motion (probabilistic models) + Partial information from sensors

  9. Inference from sensor data Tracking arm motion Knowledge about the arm’s motion (probabilistic models) Arm’s gesture and movement tracking + Partial information from sensors

  10. Probability refresher

  11. A few basic probability rules Joint probability: P (A, B) = P (B, A) Likelihood of multiple events occurring simultaneously

  12. A few basic probability rules Joint probability: P (A, B) = P (B, A) Likelihood of multiple events occurring simultaneously Conditional probability: P (A|B) = P(A,B)/P(B) Probability of an event, given another event has occurred

  13. A few basic probability rules Joint probability: P (A, B) = P (B, A) Likelihood of multiple events occurring simultaneously Conditional probability: P (A|B) = P(A,B)/P(B) Probability of an event, given another event has occurred Marginal probability: P (B) = Σi P (B, Ai) Probability of the occurrence of an event, unconditioned on other events

  14. A few basic probability rules Joint probability: P (A, B) = P (B, A) Likelihood of multiple events occurring simultaneously Conditional probability: P (A|B) = P(A,B)/P(B) Probability of an event, given another event has occurred Marginal probability: P (B) = Σi P (B, Ai) Probability of the occurrence of an event, unconditioned on other events (Figure: event B overlapping a partition A1, A2, A3 of the sample space)

  15. A few basic probability rules Joint probability: P (A, B) = P (B, A) Likelihood of multiple events occurring simultaneously Conditional probability: P (A|B) = P(A,B)/P(B) Probability of an event, given another event has occurred Marginal probability: P (B) = Σi P (B, Ai) Probability of the occurrence of an event, unconditioned on other events Chain rule: P (A, B, C) = P(A|B,C) P(B,C) = P(A|B,C) P(B|C) P(C) Factors a joint probability into a product of conditional probabilities
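The rules on these slides can be checked numerically. A minimal sketch, using a small joint distribution over two binary events; the probabilities below are invented for illustration only.

```python
# Hypothetical joint distribution P(A, B) over two binary events.
joint = {
    ("a", "b"): 0.30,
    ("a", "not_b"): 0.10,
    ("not_a", "b"): 0.20,
    ("not_a", "not_b"): 0.40,
}

# Marginal probability: P(B) = sum over A of P(A, B)
p_b = sum(p for (a, b), p in joint.items() if b == "b")

# Conditional probability: P(A=a | B=b) = P(a, b) / P(b)
p_a_given_b = joint[("a", "b")] / p_b

# Chain rule check: P(a, b) = P(a | b) * P(b)
assert abs(p_a_given_b * p_b - joint[("a", "b")]) < 1e-12

print(p_b)          # 0.5
print(p_a_given_b)  # 0.6
```

The chain-rule assertion holds by construction: conditioning divides by P(b), and multiplying back by P(b) recovers the joint entry.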

  16. Bayes rule Bayes rule: P (Ai|B) = P (B|Ai) P (Ai) / P (B)

  17. Bayes rule Bayes rule: P (Ai|B) = P (B|Ai) P (Ai) / P (B) Posterior: P (Ai|B) Likelihood: P (B|Ai) Prior: P (Ai)

  18. Bayes rule Bayes rule: P (Ai|B) = P (Ai, B) / P (B) = P (B|Ai) P (Ai) / P (B) Posterior: P (Ai|B) Likelihood: P (B|Ai) Prior: P (Ai)

  19. Bayes rule Bayes rule: P (Ai|B) = P (Ai, B) / P (B) = P (B|Ai) P (Ai) / P (B) = P (B|Ai) P (Ai) / Σj P (B|Aj) P (Aj) Posterior: P (Ai|B) Likelihood: P (B|Ai) Prior: P (Ai)

  20. Bayes rule Bayes rule: P (Ai|B) = P (Ai, B) / P (B) = P (B|Ai) P (Ai) / P (B) = P (B|Ai) P (Ai) / Σj P (B|Aj) P (Aj) Posterior: P (Ai|B) Likelihood: P (B|Ai) Prior: P (Ai) Relates inverse representations of the probabilities concerning two events

  21. Bayes rule Healthy: A1 Diabetes: A2 Cancer: A3 Smoker: B Bayes rule: P (Ai|B) = P (B|Ai) P (Ai) / Σj P (B|Aj) P (Aj) Posterior: P (Ai|B) Likelihood: P (B|Ai) Prior: P (Ai) Relates inverse representations of the probabilities concerning two events
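The smoker example on slide 21 can be sketched in a few lines. The priors P(Ai) and likelihoods P(B|Ai) below are made-up numbers, not values from the lecture; the point is the mechanics of Bayes rule.

```python
# Hypothetical priors P(Ai) over health conditions.
priors = {"healthy": 0.80, "diabetes": 0.15, "cancer": 0.05}
# Hypothetical likelihoods P(smoker | Ai).
likelihood = {"healthy": 0.20, "diabetes": 0.30, "cancer": 0.60}

# Evidence via total probability: P(B) = sum_j P(B|Aj) P(Aj)
p_b = sum(likelihood[a] * priors[a] for a in priors)

# Posterior: P(Ai|B) = P(B|Ai) P(Ai) / P(B)
posterior = {a: likelihood[a] * priors[a] / p_b for a in priors}

print(posterior)
```

Note how observing "smoker" shifts probability mass toward the conditions with higher likelihood, even though their priors are small.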

  22. Markov Model Sunny Sunny Rainy Sunny Rainy Rainy t1 t2 t3 t4 t5 t6 (time)

  23. Markov Model Sunny Sunny Rainy Sunny Rainy Rainy t1 t2 t3 t4 t5 t6 (time) Sunny day Rainy day

  24. Markov Model Sunny Sunny Rainy Sunny Rainy Rainy t1 t2 t3 t4 t5 t6 (time) P (sunny after rainy) P (sunny after sunny) P (rainy after rainy) Sunny day Rainy day P (rainy after sunny)

  25. Markov Model Sunny Sunny Rainy Sunny Rainy Rainy t1 t2 t3 t4 t5 t6 (time) P (sunny after rainy) P (sunny after sunny) P (rainy after rainy) The future depends on the present only, not on the past Sunny day Rainy day P (rainy after sunny)

  26. Markov Model Sunny Sunny Rainy Sunny Rainy Rainy t1 t2 t3 t4 t5 t6 (time) P (sunny after rainy) P (sunny after sunny) P (rainy after rainy) Sunny day Rainy day P (rainy after sunny)

  27. Markov Model Sunny Sunny Rainy Sunny Rainy Rainy t1 t2 t3 t4 t5 t6 (time) P (sunny after rainy) P (sunny after sunny) P (rainy after rainy) P (rainy after sunny) Sunny day Rainy day (Figure: each state also carries observable features: Temp., Humidity, Wind)
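The two-state weather chain above is easy to simulate: the next state is sampled from a distribution that depends only on the current state. The transition probabilities here are assumed values, not from the slides.

```python
import random

# Assumed transition probabilities P(next state | current state).
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given only the current one (Markov property)."""
    return "sunny" if rng.random() < transition[state]["sunny"] else "rainy"

rng = random.Random(0)
seq = ["sunny"]
for t in range(5):
    seq.append(step(seq[-1], rng))

print(seq)
```

The loop never looks further back than `seq[-1]`, which is exactly the "future depends on the present only" assumption from slide 25.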

  28. Hidden Markov Model

  29. Hidden Markov Model: Toy robot localization example Observations = sensor measurement Find location of the robot (Hidden information) S1 S2 S3 …

  30. Hidden Markov Model: Toy robot localization example t=0 (Figure: belief over grid cells; color scale Prob 0 to 1)

  31. Hidden Markov Model: Toy robot localization example t=1 (Figure: belief over grid cells; color scale Prob 0 to 1)

  32. Hidden Markov Model: Toy robot localization example t=1 (Figure: belief over grid cells; color scale Prob 0 to 1)

  33. Hidden Markov Model: Toy robot localization example t=2 (Figure: belief over grid cells; color scale Prob 0 to 1)

  34. Hidden Markov Model: Toy robot localization example t=3 (Figure: belief over grid cells; color scale Prob 0 to 1)

  35. Hidden Markov Model: Toy robot localization example t=4 (Figure: belief over grid cells; color scale Prob 0 to 1)

  36. Hidden Markov Model: Toy robot localization example t=5 (Figure: belief over grid cells; color scale Prob 0 to 1)

  37. Hidden Markov Model: Toy robot localization example State = location on the grid = Si S1 S2 S3 …

  38. Hidden Markov Model: Toy robot localization example Observations = sensor measurement = Mi State = location on the grid = Si S1 S2 S3 …

  39. Hidden Markov Model: Toy robot localization example Observations = sensor measurement = Mi State = location on the grid = Si S1 S2 S3 … Mi depends on the current state Si only (Emission)

  40. Hidden Markov Model: Toy robot localization example Observations = sensor measurement = Mi State = location on the grid = Si S1 S2 S3 … Mi depends on the current state Si only (Emission) Si → Si+1 → Si+2 … States change over time (Transition)
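The t=0 through t=5 pictures correspond to a histogram (Bayes) filter: start from a uniform belief, weight each cell by the sensor likelihood (emission), then shift probability mass according to the motion model (transition). A minimal sketch on a 1-D grid; the motion model, sensor model, and door map are all assumptions, not from the slides.

```python
N = 10                      # number of grid cells
belief = [1.0 / N] * N      # t=0: uniform, location unknown

def predict(belief):
    """Transition: robot moves one cell right with prob 0.9, stays with 0.1."""
    new = [0.0] * N
    for i, p in enumerate(belief):
        new[(i + 1) % N] += 0.9 * p
        new[i] += 0.1 * p
    return new

def update(belief, door_map, saw_door):
    """Emission: weight each cell by the sensor likelihood, then normalize."""
    new = []
    for i, p in enumerate(belief):
        if saw_door:
            like = 0.8 if door_map[i] else 0.1   # hit vs. false alarm
        else:
            like = 0.2 if door_map[i] else 0.9   # miss vs. correct rejection
        new.append(like * p)
    z = sum(new)
    return [p / z for p in new]

doors = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]   # hypothetical map of door cells
belief = update(belief, doors, saw_door=True)
belief = predict(belief)
print(belief)
```

After the update step the belief concentrates on door cells; repeated predict/update cycles with further measurements sharpen it toward the true location, mirroring the sequence of figures above.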

  41. Hidden Markov Model: Definition S1 S2 S3 S4 M1 M2 M3 M4

  42. Hidden Markov Model: Definition Hidden states S1 S2 S3 S4 M1 M2 M3 M4

  43. Hidden Markov Model: Definition Hidden states S1 S2 S3 S4 M1 M2 M3 M4 Observations

  44. Hidden Markov Model: Definition S1 → S2 : Transition S1 → M1 : Emission

  45. Hidden Markov Model: Definition S1 → S2 : Transition S1 → M1 : Emission Markov assumption: Transition probability depends on the current state only. Output independence assumption: Output/Emission probability depends on the current state only.

  46. Hidden Markov Model: Definition Probability of a sequence of hidden states, given a sequence of observations: P (S1, …, ST | M1, …, MT)

  47. Hidden Markov Model: Definition Probability of a sequence of hidden states, given a sequence of observations: P (S1, …, ST | M1, …, MT) = P (S1, …, ST, M1, …, MT) / P (M1, …, MT)

  48. Hidden Markov Model: Definition Probability of a sequence of hidden states, given a sequence of observations: P (S1, …, ST | M1, …, MT) ∝ P (S1, …, ST, M1, …, MT)

  49. Hidden Markov Model: Definition Chain rule: P (S1, …, ST, M1, …, MT) = Πt P (Mt | S1…St, M1…Mt−1) P (St | S1…St−1, M1…Mt−1)

  50. Hidden Markov Model: Definition Observation depends on the current state only: P (Mt | S1…St, M1…Mt−1) = P (Mt | St) Future state depends on the current state only: P (St | S1…St−1, M1…Mt−1) = P (St | St−1) So P (S1, …, ST, M1, …, MT) = Πt P (Mt | St) P (St | St−1)
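Under the two assumptions above, the joint probability of a state sequence and an observation sequence factors into initial, transition, and emission terms. A sketch with toy two-state tables; all probability values are assumed for illustration.

```python
# Hypothetical HMM parameters: initial P(S1), transition P(St|St-1),
# emission P(Mt|St).
init = {"A": 0.6, "B": 0.4}
trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
emit = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}

def joint_prob(states, obs):
    """P(S1..ST, M1..MT) = P(S1) P(M1|S1) * prod_t P(St|St-1) P(Mt|St)."""
    p = init[states[0]] * emit[states[0]][obs[0]]
    for t in range(1, len(states)):
        p *= trans[states[t - 1]][states[t]] * emit[states[t]][obs[t]]
    return p

print(joint_prob(["A", "A", "B"], ["x", "x", "y"]))  # 0.081648
```

Each factor in the loop matches one term of the factorized chain rule: a transition P(St|St−1) and an emission P(Mt|St), nothing conditioned on the deeper past.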
