


CS498-EA Reasoning in AI, Lecture #5

Instructor: Eyal Amir

Fall Semester 2009


Last Time

  • Propositional Logic

    • Inference in different representations

    • CNF: SAT hard; small representation sometimes

    • DNF: SAT easy; large representation

    • OBDDs: SAT easy; large representation sometimes

    • NNF: SAT hard; large representations needed least often (most succinct)

  • Applications:

    • Circuit and program verification; computational biology


Pop Quiz (5 min)

  • Prove or disprove:

    • Every CNF representation of propositional theories over n variables takes O(2^n) space for some theories (hint: how many non-equivalent theories over n variables are there?)

  • Give me your answer; NO IMPACT on your final score in this class
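
A sketch of the counting argument the hint points at (my reasoning, not the official solution): there are \(2^{2^n}\) non-equivalent theories over n variables, one per Boolean function, while at most \(2^{s+1}\) theories can receive encodings of length at most s bits. Hence any representation scheme, CNF included, must spend \(s \ge 2^n - 1 = \Omega(2^n)\) bits on some theory.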


Today

  • Probabilistic graphical models

  • Treewidth methods:

    • Variable elimination

    • Clique tree algorithm

  • Applications du jour: Sensor Networks


Probability

  • A sample space Ω is the set of outcomes of a random experiment

  • A probability measure P is a function from a σ-field A on Ω (the events, e.g., all measurable subsets) to [0,1]

  • A random variable X is a function X: Ω → ℝ such that for every Borel set B ⊆ ℝ, the preimage X⁻¹(B) is in A

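A minimal concrete instance of these definitions (my example, not from the slides), for two fair coin flips:

\[ \Omega = \{HH, HT, TH, TT\}, \qquad \mathcal{A} = 2^{\Omega}, \qquad P(E) = |E|/4, \qquad X(\omega) = \text{number of heads in } \omega , \]

where X is a random variable because every preimage \(X^{-1}(B)\) is a subset of \(\Omega\) and hence lies in \(\mathcal{A}\).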

Independent Random Variables

  • Two variables X and Y are independent if

    • P(X = x|Y = y) = P(X = x) for all values x,y

    • That is, learning the value of Y does not change the prediction of X

  • If X and Y are independent then

    • P(X,Y) = P(X|Y)P(Y) = P(X)P(Y)

  • In general, if X1,…,Xn are independent, then P(X1,…,Xn) = P(X1)···P(Xn)

    • Requires only O(n) parameters

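To make the parameter count concrete, here is a small sketch (hypothetical numbers, using NumPy) contrasting the full joint table with the factored form for independent binary variables:

```python
import numpy as np

p = 10  # number of independent binary variables

# A full joint table needs one probability per assignment: 2**p entries.
full_table_size = 2 ** p                     # 1024 parameters

# Under independence we only store P(X_i = 1) for each i: p entries.
marginals = np.random.rand(p)                # O(p) parameters

# Any joint probability is recovered as a product of marginals:
x = np.random.randint(0, 2, size=p)          # a random assignment
p_x = np.prod(np.where(x == 1, marginals, 1 - marginals))

print(full_table_size, p, p_x)
```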

Conditional Independence

  • Unfortunately, most random variables of interest are not independent of each other

  • A more suitable notion is that of conditional independence

  • Two variables X and Y are conditionally independent given Z if

    • P(X = x|Y = y,Z=z) = P(X = x|Z=z) for all values x,y,z

    • That is, learning the value of Y does not change the prediction of X once we know the value of Z

    • Notation: I(X, Y | Z)

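A numerical check of this definition (a sketch with made-up numbers; multiplying through by P(z) avoids dividing by zero):

```python
import numpy as np

def is_cond_indep(P, tol=1e-9):
    """Test I(X, Y | Z) on a joint table P[x, y, z]:
    equivalent to P(x,y,z) * P(z) == P(x,z) * P(y,z) for all x, y, z."""
    Pz = P.sum(axis=(0, 1))                       # P(z)
    Pxz = P.sum(axis=1)                           # P(x, z)
    Pyz = P.sum(axis=0)                           # P(y, z)
    return np.allclose(P * Pz[None, None, :],
                       Pxz[:, None, :] * Pyz[None, :, :], atol=tol)

# Build a joint that is conditionally independent by construction:
Px_z = np.array([[0.9, 0.2], [0.1, 0.8]])         # P(x | z), columns sum to 1
Py_z = np.array([[0.3, 0.7], [0.7, 0.3]])         # P(y | z)
Pz = np.array([0.6, 0.4])                         # P(z)
P = Px_z[:, None, :] * Py_z[None, :, :] * Pz[None, None, :]

print(is_cond_indep(P))                           # True
```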

Example: Family trees

[Figure: family-tree network over the nodes Homer, Marge, Bart, Lisa, Maggie; labeled as a noisy stochastic process]

Example: Pedigree

  • A node represents an individual's genotype

  • Modeling assumptions:

    • Ancestors can affect descendants' genotypes only by passing genetic material through intermediate generations

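To illustrate the kind of local model a pedigree network uses, here is a sketch of a single-locus Mendelian CPT (the alleles A/a and the function below are my hypothetical example, not from the slides):

```python
from collections import Counter
from itertools import product

def child_cpd(mother: str, father: str) -> dict:
    """P(child genotype | parent genotypes): the child draws one allele
    uniformly at random from each parent."""
    outcomes = Counter("".join(sorted(m + f))      # normalize 'aA' -> 'Aa'
                       for m, f in product(mother, father))
    total = sum(outcomes.values())
    return {g: c / total for g, c in outcomes.items()}

print(child_cpd("Aa", "Aa"))   # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
```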

Markov Assumption

[Figure: DAG fragment showing an ancestor, parents Y1 and Y2 of a node X, a descendant of X, and non-descendants of X]

  • We now make this independence assumption more precise for directed acyclic graphs (DAGs)

  • Each random variable X is independent of its non-descendants, given its parents Pa(X)

  • Formally: I(X, NonDesc(X) | Pa(X))


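A sketch (my code; the graph is the alarm network from the next slide) of how Pa(X) and NonDesc(X) fall out of a DAG:

```python
def descendants(dag, x):
    """All nodes reachable from x along directed edges (dag: node -> children)."""
    seen, stack = set(), [x]
    while stack:
        for child in dag[stack.pop()]:
            if child not in seen:
                seen.add(child)
                stack.append(child)
    return seen

dag = {"B": ["A"], "E": ["A", "R"], "A": ["C"], "R": [], "C": []}
parents = {v: [u for u in dag if v in dag[u]] for v in dag}

x = "A"
non_desc = set(dag) - descendants(dag, x) - {x}
# Markov assumption for x: I(A, {R} | {B, E})
print(parents[x], non_desc - set(parents[x]))      # ['B', 'E'] {'R'}
```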

Markov Assumption Example

[Figure: DAG with edges Burglary → Alarm, Earthquake → Alarm, Earthquake → Radio, Alarm → Call]

  • In this example:

    • I(E, B)

    • I(B, {E, R})

    • I(R, {A, B, C} | E)

    • I(A, R | B, E)

    • I(C, {B, E, R} | A)

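One of these independencies can be verified numerically. Below is a sketch with hypothetical CPTs (the numbers are made up; only the DAG structure comes from the slide) that checks I(R, A | E), which follows from I(R, {A, B, C} | E) by decomposition:

```python
import itertools

pB = {1: 0.01, 0: 0.99}                                      # P(b)
pE = {1: 0.02, 0: 0.98}                                      # P(e)
pR_E = {1: {1: 0.90, 0: 0.10}, 0: {1: 0.01, 0: 0.99}}        # P(r | e)
pA_BE = {(1, 1): 0.95, (1, 0): 0.94,
         (0, 1): 0.29, (0, 0): 0.001}                        # P(A=1 | b, e)
pC_A = {1: 0.70, 0: 0.05}                                    # P(C=1 | a)

def joint(b, e, r, a, c):
    pa = pA_BE[b, e] if a else 1 - pA_BE[b, e]
    pc = pC_A[a] if c else 1 - pC_A[a]
    return pB[b] * pE[e] * pR_E[e][r] * pa * pc

vals = (0, 1)
for e in vals:
    p_e = sum(joint(b, e, r, a, c)
              for b, r, a, c in itertools.product(vals, repeat=4))
    for r, a in itertools.product(vals, repeat=2):
        p_ra = sum(joint(b, e, r, a, c)
                   for b, c in itertools.product(vals, repeat=2)) / p_e
        p_r = sum(joint(b, e, r, a2, c)
                  for b, a2, c in itertools.product(vals, repeat=3)) / p_e
        p_a = sum(joint(b, e, r2, a, c)
                  for b, r2, c in itertools.product(vals, repeat=3)) / p_e
        assert abs(p_ra - p_r * p_a) < 1e-12
print("I(R, A | E) holds in this joint")
```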

I-Maps

[Figure: two candidate DAGs over X and Y: one with the edge X → Y, one with no edge]

  • A DAG G is an I-Map of a distribution P if all Markov assumptions implied by G are satisfied by P

    (Assuming G and P both use the same set of random variables)

    Examples: see the sketch below.

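A sketch of the two-variable case (my numbers): the edgeless DAG asserts I(X, Y), so it is an I-map of P exactly when P factorizes into its marginals; a fully connected DAG asserts nothing and is an I-map of every P.

```python
import numpy as np

def edgeless_is_imap(Pxy, tol=1e-9):
    """The DAG with no edge between X and Y claims only I(X, Y);
    it is an I-map of P iff P(x, y) == P(x) * P(y) everywhere."""
    return np.allclose(Pxy,
                       np.outer(Pxy.sum(axis=1), Pxy.sum(axis=0)), atol=tol)

P1 = np.array([[0.25, 0.25], [0.25, 0.25]])   # X, Y independent (uniform)
P2 = np.array([[0.5, 0.0], [0.0, 0.5]])       # X = Y, strongly dependent

print(edgeless_is_imap(P1))   # True:  the empty graph is an I-map of P1
print(edgeless_is_imap(P2))   # False: it is not an I-map of P2
# The DAG X -> Y makes no independence claims, so it is an I-map of both.
```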

Factorization

[Figure: DAG over X and Y with no edge between them]

  • Given that G is an I-Map of P, can we simplify the representation of P?

  • Example:

  • Since I(X,Y), we have that P(X|Y) = P(X)

  • Applying the chain rule: P(X,Y) = P(X|Y) P(Y) = P(X) P(Y)

  • Thus, we have a simpler representation of P(X,Y)
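
This two-variable case is an instance of a general fact (stated here for completeness, not on this slide): applying the chain rule in a topological order of G, and dropping from each conditioning set the non-descendants licensed by the Markov assumptions, gives

\[ P(X_1, \ldots, X_n) \;=\; \prod_{i=1}^{n} P(X_i \mid \mathrm{Pa}(X_i)) \]

whenever G is an I-map of P.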


