
Bayesian networks and their application in circuit reliability estimation


Presentation Transcript


  1. Bayesian networks and their application in circuit reliability estimation
  Erin Taylor

  2. What is a Bayesian Network?
  • An example: we want to describe the causal relationships among the following events:
    1) The season
    2) Whether it is raining outside
    3) The sprinkler is on
    4) The sidewalk is wet
    5) The sidewalk is slippery
  • We can construct a graph to represent the causal links between these five events.

  3. What is a Bayesian Network?
  • Assumptions:
    • "Sprinkler on" and "Rain" are determined by "Season"
    • "Sidewalk wet" is determined by "Sprinkler on" and "Rain"
    • "Sidewalk slippery" is determined by "Sidewalk wet"
  • Each node represents a random variable, in this case the occurrence of a particular event.
  [Diagram: Season → Sprinkler and Rain; Sprinkler, Rain → Wet; Wet → Slippery]

  4. Properties of Bayesian Networks
  • A Bayesian network is
    • A directed acyclic graph (DAG)
    • A model of probabilistic events
  • In a Bayesian network
    • Nodes represent random variables of interest
    • Links represent causal dependencies among variables
  • Bayesian networks are direct representations of the world: arrows indicate real causal connections, not the flow of information as in neural networks.

  5. Properties of Bayesian Networks
  • Links are not absolute
    • If the sprinkler is on, this does not always mean that the sidewalk is wet
    • For example, the sprinkler may be aimed away from the sidewalk
  [Diagram: Season → Sprinkler and Rain; Sprinkler, Rain → Wet; Wet → Slippery]

  6. Properties of Bayesian Networks
  • Given that the sidewalk is wet, we can calculate the probability that the sprinkler is on: P(sprinkler on | sidewalk wet)
  • Bayesian networks allow us to calculate such values from a small set of probabilities, in a process called reasoning or Bayesian inference
  [Diagram: Season → Sprinkler and Rain; Sprinkler, Rain → Wet; Wet → Slippery]

  7. Reasoning in Bayesian Networks
  • Reasoning in Bayesian networks operates by propagating information in any direction:
    1) If the sprinkler is on, the sidewalk is probably wet (prediction)
    2) If the sidewalk is wet, it is more likely that the sprinkler is on or it is raining (abduction)
    3) If the sidewalk is wet and the sprinkler is on, the likelihood that it is raining is reduced (explaining away)
  • Explaining away is a special type of reasoning that is especially difficult to capture in other network models
  [Diagram: Season → Sprinkler and Rain; Sprinkler, Rain → Wet; Wet → Slippery]

  8. Specifying a Bayesian Network
  • A new example:
    • When a family leaves their house, they often turn the front light on and let the dog out
    • If the dog is dirty, the family often puts him outside
    • If the dog is out, you can sometimes hear him bark
  [Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]

  9. Specifying a Bayesian Network
  • To specify the probability distribution of a Bayesian network we need
    • The prior probability of all root nodes
    • The conditional probabilities of all nonroot nodes given all possible combinations of their direct predecessors
  • Priors: P(fo) = 0.15, P(dd) = 0.01
  • P(lo | fo) = 0.6, P(lo | ~fo) = 0.05
  • P(do | fo dd) = 0.99, P(do | fo ~dd) = 0.90, P(do | ~fo dd) = 0.97, P(do | ~fo ~dd) = 0.3
  • P(hb | do) = 0.7, P(hb | ~do) = 0.01
  • Total specified values: 10
  [Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]
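
As a concrete illustration, here is a minimal sketch in Python of how these ten numbers fully specify the network (the variable abbreviations fo/dd/lo/do/hb follow the slide; the dictionary layout is my choice):

```python
# Priors for the two root nodes (slide 9)
prior = {"fo": 0.15, "dd": 0.01}

# Conditional probability tables for the nonroot nodes,
# keyed by the values of their direct predecessors
p_lo_given_fo = {True: 0.60, False: 0.05}                       # P(lo=1 | fo)
p_do_given_fo_dd = {(True, True): 0.99, (True, False): 0.90,
                    (False, True): 0.97, (False, False): 0.30}  # P(do=1 | fo, dd)
p_hb_given_do = {True: 0.70, False: 0.01}                       # P(hb=1 | do)

# 2 priors + 2 + 4 + 2 conditional entries = 10 specified values
```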

  10. Bayesian Networks and Probability Theory
  • In traditional probability theory, specifying the previous example would require the joint distribution of all 5 variables: P(fo, dd, lo, do, hb)
  • The joint distribution of 5 Boolean variables requires 2^5 − 1, or 31, values
  [Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]

  11. Bayesian Networks and Probability Theory
  • To see where 2^5 − 1 comes from, consider the pair of Boolean variables (a, b)
  • To specify their joint probability distribution we need the following values:
    P(a b)   P(~a b)
    P(a ~b)  P(~a ~b)
  • In the general case, this yields a total of 2^n values for a system of n Boolean variables
  • Since the probabilities of all possible outcomes must sum to 1, we can reduce the number of values to 2^n − 1
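
A quick sanity check of the two counts for the five-variable dog example (a sketch; the per-node parent counts are simply read off the graph):

```python
# Full joint table over n Boolean variables: 2^n entries,
# minus 1 because the probabilities must sum to 1.
n = 5
full_joint = 2**n - 1                              # 31

# Bayesian network: each node needs 2^(number of parents) values,
# one P(node=1 | parent assignment) per parent combination.
parents = {"fo": 0, "dd": 0, "lo": 1, "do": 2, "hb": 1}
bn_values = sum(2**k for k in parents.values())    # 1 + 1 + 2 + 4 + 2 = 10

print(full_joint, bn_values)                       # 31 10
```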

  12. Bayesian Networks and Joint Probabilities
  • Using a Bayesian network for this example, we can reduce the number of values that need to be specified from 31 to 10
    • Full joint distribution P(fo, dd, lo, do, hb): 31 specified values
    • Bayesian network: 10 specified values
  • How is this possible?
  [Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]

  13. Simplifying Joint Distributions
  • Bayesian networks reduce the complexity of joint distributions by introducing independence assumptions
  • Conditional independence:
    • If we know whether the dog is out, then the probability of hearing him bark is independent of all other events: P(hb | fo dd lo do) = P(hb | do)
    • The other events only serve to indicate how likely it is that the dog is out
  • Similarly, dd is independent of fo (they share no ancestors), so P(dd | fo) = P(dd), the prior probability
  [Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]

  14. Simplifying Joint Distributions
  • From probability theory (the chain rule): P(x1, …, xn) = P(x1) P(x2 | x1) … P(xn | x1 … xn−1)
  • In our example: P(fo, dd, lo, do, hb) = P(fo) P(dd | fo) P(lo | fo dd) P(do | fo dd lo) P(hb | fo dd lo do)
  • Simplify using the network's independence assumptions:
    P(dd | fo) = P(dd)
    P(lo | fo dd) = P(lo | fo)
    P(do | fo dd lo) = P(do | fo dd)
    P(hb | fo dd lo do) = P(hb | do)
  • Result: P(fo, dd, lo, do, hb) = P(fo) P(dd) P(lo | fo) P(do | fo dd) P(hb | do)
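
A minimal sketch of this factored joint in Python, with the ten numbers from slide 9 inlined (the helper f is mine, not from the slides):

```python
from itertools import product

def joint(fo, dd, lo, do, hb):
    """P(fo, dd, lo, do, hb) = P(fo) P(dd) P(lo|fo) P(do|fo,dd) P(hb|do)."""
    f = lambda p, x: p if x else 1.0 - p          # P(X=x) from P(X=1)=p
    return (f(0.15, fo)                           # P(fo)
            * f(0.01, dd)                         # P(dd)
            * f(0.60 if fo else 0.05, lo)         # P(lo | fo)
            * f({(1, 1): 0.99, (1, 0): 0.90,
                 (0, 1): 0.97, (0, 0): 0.30}[(fo, dd)], do)   # P(do | fo, dd)
            * f(0.70 if do else 0.01, hb))        # P(hb | do)

# The 2^5 = 32 entries of the full joint sum to 1:
assert abs(sum(joint(*v) for v in product((0, 1), repeat=5)) - 1.0) < 1e-12
```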

  15. Simplifying Joint Distributions
  • Only the five terms on the right side are needed to specify the joint distribution of our example:
    P(fo, dd, lo, do, hb) = P(fo) P(dd) P(lo | fo) P(do | fo dd) P(hb | do)
    (the prior probabilities of the root nodes and the conditional probabilities of the nonroot nodes given all combinations of their predecessors)
  • The number of values that must be specified in a Bayesian network grows linearly with the number of variables, whereas the full joint distribution grows exponentially

  16. Evaluating Probabilities Using BNs
  • The basic computation on Bayesian networks is computing every node's belief (conditional probability) given the evidence observed
  • For example:
    • Evidence: the dog is heard barking
    • Compute: the probability that the family is out
    • Compute: the probability that the light is on
  [Diagram: Family out → Light on, Dog out; Dog dirty → Dog out; Dog out → Hear bark]

  17. Evaluating Probabilities Using BNs
  • Solving Bayesian networks involves Bayesian inference
  • Exact solution
    • Involves enumerating all possible probability combinations
    • Generally NP-hard
  • Simple query: P(fo = true | hb = true)
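
A sketch of exact inference by enumeration for this query, reusing the factored joint from slide 14 (the ≈ 0.335 in the comment is just the value this code prints):

```python
from itertools import product

def joint(fo, dd, lo, do, hb):
    # Factored joint from slide 14, numbers from slide 9.
    f = lambda p, x: p if x else 1.0 - p
    return (f(0.15, fo) * f(0.01, dd) * f(0.60 if fo else 0.05, lo)
            * f({(1, 1): 0.99, (1, 0): 0.90,
                 (0, 1): 0.97, (0, 0): 0.30}[(fo, dd)], do)
            * f(0.70 if do else 0.01, hb))

# P(fo=1 | hb=1): sum the joint over the hidden variables with fo fixed,
# then normalize by the same sum with fo free.
num = sum(joint(1, dd, lo, do, 1) for dd, lo, do in product((0, 1), repeat=3))
den = sum(joint(fo, dd, lo, do, 1)
          for fo, dd, lo, do in product((0, 1), repeat=4))
print(num / den)   # ≈ 0.335: hearing the bark more than doubles the 0.15 prior
```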

  18. Evaluating Probabilities Using BNs
  • Approximate solutions
    • Logic sampling
    • Markov chain Monte Carlo algorithms
    • Likelihood weighting
  • General approach to approximate solutions
    • Select values for a subset of nodes
    • Use this 'evidence' to pick values for the remaining nodes
    • Keep statistics on all the nodes' values

  19. Logic Sampling
  • Logic sampling algorithm:
    1) Guess values for all root nodes according to their prior probabilities
       P(fo) = 0.15 → fo = true 15% of the time
    2) Work down the network, guessing a value for each lower node based on its parents' values
       Previous values: fo = true and dd = false; P(do | fo ~dd) = 0.90 → do = true 90% of the time
    3) Repeat many times for the entire network, keeping track of how often each node is assigned each value
    4) To determine a conditional probability such as P(fo = true | hb = true), consider only the samples where hb = true and count the fraction in which fo = true
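
A minimal sketch of these four steps for the dog network (Python's random module; the sample count of 200,000 is my choice):

```python
import random

def sample():
    # 1) Sample the root nodes from their priors (slide 9).
    fo = random.random() < 0.15
    dd = random.random() < 0.01
    # 2) Work down the network: each child is sampled given its parents.
    lo = random.random() < (0.60 if fo else 0.05)
    do = random.random() < {(True, True): 0.99, (True, False): 0.90,
                            (False, True): 0.97, (False, False): 0.30}[(fo, dd)]
    hb = random.random() < (0.70 if do else 0.01)
    return fo, dd, lo, do, hb

# 3) Repeat many times; 4) estimate P(fo=true | hb=true) by counting
#    fo among the samples in which hb came out true.
n_hb = n_fo_and_hb = 0
for _ in range(200_000):
    fo, _, _, _, hb = sample()
    if hb:
        n_hb += 1
        n_fo_and_hb += fo
print(n_fo_and_hb / n_hb)   # ≈ 0.335, matching the exact enumeration above
```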

  20. Applications
  • Bayesian networks were popularized by the artificial intelligence community, which used them as a learning framework
    • "If I see a red object, what is the probability that I should stop?"
  • Used to model trust in a P2P network: "Bayesian network-based trust model in P2P networks" by Wang and Vassileva
  • Used to evaluate circuit reliability: "Scalable probabilistic computing models using Bayesian networks" by Rejimon and Bhanja

  21. BNs for Circuit Reliability
  • Circuit example
    • Inputs: Z1, Z2, Z3
    • Internal signals: X1, X2
    • Outputs: Y1, Y2
  [Circuit diagram: gates connect inputs Z1, Z2, Z3 through internal signals X1, X2 to outputs Y1, Y2]

  22. BNs for Circuit Reliability
  • Goal: analyze circuit reliability in the face of dynamic errors
  • Procedure:
    • Construct an error-prone version of the circuit in which each gate has a probability of failure p
    • Analyze this circuit in relation to the fault-free circuit
  [Diagram: error-prone copy of the circuit with internal signals Xe1, Xe2 and outputs Ye1, Ye2; each gate fails with probability p]

  23. BNs for Circuit Reliability
  • The error at the ith output can be represented mathematically:
    Ei = Yei ⊕ Yi
    P(Ei = 1) = P(Yei ⊕ Yi = 1)
  • Equivalent circuit representation: the fault-free and error-prone circuits share the inputs Z1, Z2, Z3, and each output pair (Yi, Yei) feeds an XOR gate that produces Ei
  [Diagram: fault-free circuit (X1, X2, Y1, Y2) and error-prone circuit (Xe1, Xe2, Ye1, Ye2) driven by Z1, Z2, Z3; XOR gates yield E1, E2]
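
To make the Ei = Yei ⊕ Yi definition concrete, here is a toy single-gate sketch (the NAND gate type follows slide 26; the bit-flip failure model is an assumption consistent with the slides' (1 − p) entry):

```python
import random

def nand(a, b):
    return 1 - (a & b)                          # fault-free output Y

def noisy_nand(a, b, p):
    return nand(a, b) ^ (random.random() < p)   # Ye: output flipped with prob. p

# Estimate P(E = 1) = P(Ye XOR Y = 1) for one gate over random inputs.
p, n = 0.05, 100_000
errors = sum(nand(a, b) ^ noisy_nand(a, b, p)
             for a, b in ((random.getrandbits(1), random.getrandbits(1))
                          for _ in range(n)))
print(errors / n)   # ≈ p, since a single gate's error is just its own flip
```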

  24. BNs for Circuit Reliability
  • In a circuit, each gate's output has a causal relationship with its inputs ⇒ circuits can be represented as Bayesian networks
  • In the Bayesian network representation of a circuit
    • Inputs are root nodes
    • Outputs are leaf nodes
    • Internal signals are internal nodes
  • Each node's conditional probability is determined by the gate type and the probability of error, p

  25. BNs for Circuit Reliability
  [Side-by-side figure: the circuit (inputs Z1, Z2, Z3; internal signals X1, Xe1, X2, Xe2; outputs Y1, Ye1, Y2, Ye2; error outputs E1, E2) and its Bayesian network, with one node per signal]

  26. BNs for Circuit Reliability
  • Specifying the Bayesian network
    • Prior probabilities for all root nodes (the circuit inputs)
    • Conditional probabilities for all nonroot nodes given all possible combinations of their parents
  • Example: Ye1 is the output of a NAND gate with inputs Z1 and Xe1
    P(Ye1 = 1 | Z1 = 0, Xe1 = 0) = (1 − p), the probability of no gate error
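
A sketch of how each such conditional probability table could be generated from the gate type and p (the symmetric bit-flip failure model is an assumption, chosen to match the slide's (1 − p) entry):

```python
def noisy_nand_cpt(p):
    """P(output = 1 | a, b) for a NAND gate whose output is flipped
    with probability p."""
    cpt = {}
    for a in (0, 1):
        for b in (0, 1):
            ideal = 1 - (a & b)                    # fault-free NAND output
            cpt[(a, b)] = (1 - p) if ideal else p  # P(out = 1)
    return cpt

print(noisy_nand_cpt(0.05)[(0, 0)])   # 0.95, i.e. (1 - p), as on the slide
```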

  27. BNs for Circuit Reliability
  • Solving for the error probabilities E1 and E2
    • The gate error probability is fixed (p = 0.005, 0.05, 0.1)
    • The logic sampling algorithm is used
    • This determines the probability of output error given each input combination
  • Results
    • Circuits with 2,000-3,000 gates took 18 s on average to simulate
    • Average error: 0.7%; worst-case error: 3.34%
    • Compare this to an exact method, which takes ~1,000 s to simulate simple circuits with only tens of gates

  28. Advanced Subjects in BNs
  • Dynamic Bayesian networks
    • Model variables whose values change over time
    • Capture this process by representing each state variable as multiple copies, one per time step
  • Learning in Bayesian networks
    • The conditional probabilities of each node can be updated continuously
    • Similar to the way weights are adjusted in neural networks

  29. Conclusions
  • Bayesian networks are a powerful tool for modeling probabilistic systems
  • Applications are diverse
    • Medicine
    • Image processing
    • Speech recognition
    • Computer networking
  • They enable efficient evaluation of nanoscale circuit reliability
