CSCE 582: Computation of the Most Probable Explanation in Bayesian Networks using Bucket Elimination

- Hareesh Lingareddy

University of South Carolina

Bucket Elimination
  • An algorithmic framework that generalizes dynamic programming to accommodate algorithms for many complex problem-solving and reasoning tasks.
  • Uses “buckets” to mimic the algebraic manipulations involved in each of these problems, resulting in an easily expressible algorithmic formulation.
Bucket Elimination Algorithm
  • Partition the functions of the graph into “buckets,” working backwards relative to the given node ordering.
  • The bucket of variable X holds all functions that mention X but do not mention any variable with a higher index.
  • Process the buckets backwards relative to the node ordering.
  • Each function computed during elimination is placed in the bucket of the highest-indexed variable in its scope.
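As a sketch, the partitioning step above can be written in Python (the factor representation and the variable names are illustrative, not from the slides; each factor is represented only by its scope):

```python
# Sketch of the bucket-partitioning step: each factor goes into the
# bucket of its highest-ordered variable.
def partition_into_buckets(factor_scopes, ordering):
    """factor_scopes: list of sets of variable names; ordering: low to high."""
    index = {var: i for i, var in enumerate(ordering)}
    buckets = {var: [] for var in ordering}
    for scope in factor_scopes:
        highest = max(scope, key=lambda v: index[v])
        buckets[highest].append(scope)
    return buckets

# Example: factors P(A), P(T|A), and P(X|T) under the ordering T < X < A.
# Both factors mentioning A land in A's bucket, since A is highest.
buckets = partition_into_buckets(
    [{"A"}, {"A", "T"}, {"T", "X"}],
    ordering=["T", "X", "A"],
)
```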
Algorithms using Bucket Elimination
  • Belief Assessment
  • Most Probable Explanation (MPE)
  • Maximum A Posteriori Hypothesis (MAP)
  • Maximum Expected Utility (MEU)
Belief Assessment
  • Definition
    • Given a set of evidence, compute the posterior probability of all the variables.
    • The belief assessment task of Xk = xk is to find

      bel(xk) = P(xk | e) = k · Σ_{x \ {xk}} Π_i P(xi | pa_i, e)

      where k is a normalizing constant.
  • In the Visit to Asia example, the belief assessment problem answers questions like:
    • What is the probability that a person has tuberculosis, given that he/she has dyspnoea and has visited Asia recently?
Belief Assessment Overview
  • In reverse node ordering:
    • Create the bucket function by multiplying all functions (given as tables) containing the current node.
    • Perform variable elimination using summation over the current node.
    • Place the newly created function table into the appropriate bucket.
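A minimal sketch of one such bucket step, with illustrative tables and variable names (not from the slides): multiply the tables that mention the current variable, then sum it out.

```python
import itertools

# One belief-assessment bucket step: multiply the factors in the bucket,
# then eliminate the current variable by summation.
# A factor is (scope_tuple, table_dict), with the table keyed by value tuples.
def eliminate_by_summation(factors, var, domains):
    scopes = [s for s, _ in factors]
    new_scope = tuple(sorted({v for s in scopes for v in s} - {var}))
    table = {}
    for assign in itertools.product(*(domains[v] for v in new_scope)):
        ctx = dict(zip(new_scope, assign))
        total = 0.0
        for x in domains[var]:
            ctx[var] = x
            prod = 1.0
            for scope, t in factors:
                prod *= t[tuple(ctx[v] for v in scope)]
            total += prod
        table[assign] = total
    return new_scope, table

# Example: sum B out of P(B) * P(C|B), leaving a table over C.
domains = {"B": [0, 1], "C": [0, 1]}
pB = (("B",), {(0,): 0.4, (1,): 0.6})
pC_given_B = (("B", "C"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
scope, pC = eliminate_by_summation([pB, pC_given_B], "B", domains)
# pC[(0,)] = 0.4*0.9 + 0.6*0.2 = 0.48; pC[(1,)] = 0.4*0.1 + 0.6*0.8 = 0.52
```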
Most Probable Explanation (MPE)
  • Definition
    • Given evidence, find the maximum-probability assignment to the remaining variables.
    • The MPE task is to find an assignment x^o = (x^o_1, …, x^o_n) such that

      P(x^o) = max_x Π_i P(xi | pa_i, e)
Differences from Belief Assessment
  • Replace sums with max
  • Keep track of the maximizing value at each stage
  • A “forward step” determines the maximizing assignment tuple
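The first two differences can be illustrated on a toy bucket whose scores over the eliminated variable are hypothetical numbers: instead of summing, take the max and remember which value achieved it.

```python
# Sketch of the MPE twist on a single bucket: replace the sum over the
# eliminated variable with a max, and record the maximizing value.
def max_eliminate(table, var_values):
    """table: dict mapping each value of the eliminated variable to a score."""
    best = max(var_values, key=lambda v: table[v])
    return table[best], best  # (max score, maximizing value)

# Hypothetical scores for a binary variable:
score, argmax_val = max_eliminate({"yes": 0.12, "no": 0.35}, ["yes", "no"])
# score = 0.35, argmax_val = "no"
```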
Elimination Algorithm for Most Probable Explanation

Finding the MPE = max_{a,t,s,l,b,e,x,d} P(a,t,s,l,b,e,x,d), where (in the Visit to Asia network) a = visit to Asia, t = tuberculosis, s = smoking, l = lung cancer, b = bronchitis, e = either tuberculosis or lung cancer, x = X-ray result, d = dyspnoea.

MPE = max_{a,t,s,l,b,e,x,d} [ P(t|a) · P(x|e) · P(d|e,b) · P(e|t,l) · P(b|s) · P(l|s) · P(s) · P(a) ]

Buckets are processed backwards along the ordering (t, l, s, b, e, d, x, a), maximizing over each bucket’s variable using the general step

h^n(u) = max_{x_n} Π_{C in bucket_n} C(x_n | x_pa)

Bucket a: P(t|a) · P(a)  →  h_a(t)

Bucket x: P(x|e)  →  h_x(e)

Bucket d: P(d|e,b), d = “no”  →  h_d(e,b)

Bucket e: P(e|t,l), h_d(e,b), h_x(e)  →  h_e(t,l,b)

Bucket b: P(b|s), h_e(t,l,b)  →  h_b(t,l,s)

Bucket s: P(l|s) · P(s), h_b(t,l,s)  →  h_s(t,l)

Bucket l: h_s(t,l)  →  h_l(t)

Bucket t: h_a(t), h_l(t)  →  MPE probability = max_t h_a(t) · h_l(t)

Elimination Algorithm for Most Probable Explanation

Forward part (assignments are read off in reverse bucket order, from bucket t back up to bucket a):

t' = arg max_t h_a(t) · h_l(t)

l' = arg max_l h_s(t', l)

s' = arg max_s P(l'|s) · P(s) · h_b(t', l', s)

b' = arg max_b P(b|s') · h_e(t', l', b)

e' = arg max_e P(e|t', l') · h_d(e, b') · h_x(e)

d' = “no” (the evidence value)

x' = arg max_x P(x|e')

a' = arg max_a P(t'|a) · P(a)

Return: (a', t', s', l', b', e', x', d')

MPE Overview
  • In reverse node ordering:
    • Create the bucket function by multiplying all functions (given as tables) containing the current node.
    • Perform variable elimination using the maximization operation over the current node (recording the maximizing-state function).
    • Place the newly created function table into the appropriate bucket.
  • In forward node ordering:
    • Calculate the maximum-probability assignment using the recorded maximizing-state functions.
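The backward and forward passes above can be sketched end to end on a hypothetical two-variable network A → B (all numbers illustrative):

```python
# End-to-end MPE by bucket elimination on a toy network A -> B:
# backward max-elimination, then a forward pass reading off the argmaxes.
pA = {0: 0.3, 1: 0.7}
pB_given_A = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}

# Backward: eliminate B (the highest variable) -> message h_B(a),
# recording B's maximizing value for each value of its parent A.
h_B, best_b = {}, {}
for a in (0, 1):
    scores = {b: pB_given_A[(a, b)] for b in (0, 1)}
    best_b[a] = max(scores, key=scores.get)
    h_B[a] = scores[best_b[a]]

# Eliminate A using its prior and the message from B's bucket.
scores_a = {a: pA[a] * h_B[a] for a in (0, 1)}
mpe_prob = max(scores_a.values())

# Forward: pick the maximizing A first, then the recorded B for that A.
a_star = max(scores_a, key=scores_a.get)
b_star = best_b[a_star]
# mpe_prob = max(0.3*0.9, 0.7*0.6) = 0.42 at (a, b) = (1, 1)
```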
Maximum A Posteriori Hypothesis (MAP)
  • Definition
    • Given evidence, find an assignment to a subset of “hypothesis” variables that maximizes their probability.
    • Given a set of hypothesis variables A = {A1, …, Ak}, the MAP task is to find an assignment a^o = (a^o_1, …, a^o_k) such that

      P(a^o) = max_{a1,…,ak} Σ_{x \ A} Π_i P(xi | pa_i, e)
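A sketch of the MAP computation on the same kind of hypothetical two-variable network A → B, with A as the single hypothesis variable (all numbers illustrative): sum out the non-hypothesis variable, then maximize over the hypothesis variable.

```python
# MAP on a toy network A -> B, hypothesis set {A}:
# sum over B (not a hypothesis variable), then maximize over A.
pA = {0: 0.3, 1: 0.7}
pB_given_A = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.4, (1, 1): 0.6}

# Sum out B first...
marginal_A = {a: sum(pA[a] * pB_given_A[(a, b)] for b in (0, 1))
              for a in (0, 1)}
# ...then maximize over the hypothesis variable A.
a_map = max(marginal_A, key=marginal_A.get)
# marginal_A = {0: 0.3, 1: 0.7}, so a_map = 1
```

Note the operator ordering: summation over the non-hypothesis variables must happen before the maximization, which is why MAP is generally harder than either belief assessment or MPE alone.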