
Computational Intelligence in Biomedical and Health Care Informatics, HCA 590 (Topics in Health Sciences). Rohit Kate. Bayesian Networks: Application. Reading: Bayesian Networks for Cardiovascular Monitoring.


Presentation Transcript


  1. Computational Intelligence in Biomedical and Health Care Informatics, HCA 590 (Topics in Health Sciences). Rohit Kate. Bayesian Networks: Application

  2. Reading • Bayesian Networks for Cardiovascular Monitoring. Jennifer M. Roberts, Tushar A. Parlikar, Thomas Heldt, and George C. Verghese. In Proceedings of the 28th IEEE EMBS Annual International Conference, 2006, pages 205-209. (skip Section III) • Tutorial on creating Bayesian networks in GeNIe

  3. Example Application in Medicine • Paper: Bayesian Networks for Cardiovascular Monitoring. Jennifer M. Roberts, Tushar A. Parlikar, Thomas Heldt, and George C. Verghese. In Proceedings of the 28th IEEE EMBS Annual International Conference, 2006, pages 205-209. • Why this paper? • Shows a good sample application in medicine • Not a complicated Bayesian network • You may read the following for a more general overview: Bayesian Networks in Biomedicine and Health-care (Editorial). P.J. Lucas, L.C. van der Gaag, and A. Abu-Hanna. Artificial Intelligence in Medicine. 2004 Mar;30(3):201-14.

  4. Cardiovascular Model • Goal: Predict stroke volume, cardiac output, and total peripheral resistance, which are not easy to measure, using heart rate and blood pressure as evidence, which are easy to measure • The authors built the cardiovascular model as a Bayesian network • Advantages: • Does not need all the evidence to be known in order to make predictions • The probabilistic framework lets one work with unreliable evidence

  5. Bayesian Network [network diagram] Heart Rate (HR) and Stroke Volume (SV) are parents of Cardiac Output (CO); CO and Total Peripheral Resistance (TPR) are parents of Blood Pressure (BP)

  6. Underlying Joint Probability Distribution In a Bayesian network, given the values of all its parents, a node is conditionally independent of all its non-descendants Independence assumptions: • Heart rate, stroke volume, and total peripheral resistance are independent of each other • Given heart rate and stroke volume, cardiac output is independent of total peripheral resistance • Given cardiac output and total peripheral resistance, blood pressure is independent of heart rate and stroke volume Hence: P(SV,BP,TPR,CO,HR) = P(HR)*P(SV)*P(TPR)*P(CO|HR,SV)*P(BP|TPR,CO)
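The factorization on this slide can be sketched directly in code. This is a minimal illustration, not the paper's model: the variables are binarized to "low"/"high" for brevity (the paper uses five levels), and every probability below is made up.

```python
# Sketch of the factored joint from the slide:
# P(SV,BP,TPR,CO,HR) = P(HR)*P(SV)*P(TPR)*P(CO|HR,SV)*P(BP|TPR,CO)
# All numbers are hypothetical placeholders.

p_hr = {"low": 0.4, "high": 0.6}
p_sv = {"low": 0.5, "high": 0.5}
p_tpr = {"low": 0.3, "high": 0.7}
# P(CO | HR, SV): keyed by (hr, sv), each giving a distribution over CO
p_co = {
    ("low", "low"): {"low": 0.9, "high": 0.1},
    ("low", "high"): {"low": 0.6, "high": 0.4},
    ("high", "low"): {"low": 0.5, "high": 0.5},
    ("high", "high"): {"low": 0.1, "high": 0.9},
}
# P(BP | TPR, CO)
p_bp = {
    ("low", "low"): {"low": 0.95, "high": 0.05},
    ("low", "high"): {"low": 0.5, "high": 0.5},
    ("high", "low"): {"low": 0.4, "high": 0.6},
    ("high", "high"): {"low": 0.05, "high": 0.95},
}

def joint(hr, sv, tpr, co, bp):
    """P(HR=hr, SV=sv, TPR=tpr, CO=co, BP=bp) via the factorization."""
    return (p_hr[hr] * p_sv[sv] * p_tpr[tpr]
            * p_co[(hr, sv)][co] * p_bp[(tpr, co)][bp])

# Sanity check: the joint sums to 1 over all 2^5 assignments
total = sum(joint(h, s, t, c, b)
            for h in p_hr for s in p_sv for t in p_tpr
            for c in ("low", "high") for b in ("low", "high"))
print(round(total, 10))  # 1.0
```

The point of the factorization is that five small tables replace one table over all 2^5 joint assignments.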

  7. Parameters • Priors: P(HR), P(SV) and P(TPR) • Conditional probability tables: P(CO|HR,SV) and P(BP|CO,TPR) • All of these variables were discretized into five levels (Bayesian networks can also model random variables that take continuous numerical values) • The parameters were initialized using a standard probability distribution known as the Dirichlet distribution (a kind of smoothing)
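The smoothing effect of a symmetric Dirichlet prior can be shown with a small sketch: it amounts to adding a pseudo-count alpha to each discrete level before normalizing. The level names and counts below are hypothetical, chosen only to mirror the five-level discretization mentioned on the slide.

```python
# Sketch: estimating a prior such as P(HR) over five discrete levels
# with a symmetric Dirichlet prior (pseudo-count alpha per level).
# Level names and counts are made up for illustration.

LEVELS = ["very-low", "low", "normal", "high", "very-high"]

def estimate_with_dirichlet(counts, alpha=1.0):
    """Smoothed estimate: (count + alpha) / (N + alpha * K)."""
    n = sum(counts.values())
    k = len(LEVELS)
    return {lvl: (counts.get(lvl, 0) + alpha) / (n + alpha * k)
            for lvl in LEVELS}

counts = {"normal": 60, "high": 25, "low": 10, "very-high": 5}  # "very-low" never observed
p_hr = estimate_with_dirichlet(counts, alpha=1.0)
print(p_hr["very-low"] > 0)           # True: no level gets probability zero
print(round(sum(p_hr.values()), 10))  # 1.0
```

This is why the slide calls the Dirichlet initialization "a kind of smoothing": levels never seen in the data still receive a small nonzero probability.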

  8. Parameter Estimation • In the first study they generated simulated patient data (Section III) • In the second study they used data collected from real patients (MIMIC II database, Saeed et al. 2002) • Parameters were estimated using data from the past 5000 beats

  9. Results • Tested sequentially: using parameters estimated from time 0 to t, predict CO, SV and TPR at time t+1 given HR and BP at time t+1 • Results (accuracy): TPR: 99.7% CO: 82% SV: 87.5% • These results were considered satisfactory

  10. Bayesian Network Software • Several software packages are available for Bayesian networks • GeNIe from the University of Pittsburgh (http://genie.sis.pitt.edu/): • Freely available (http://genie.sis.pitt.edu/index.php/downloads) • Well documented (http://genie.sis.pitt.edu/wiki/GeNIe_Documentation) • Easy to use

  11. GeNIe: Creating a Bayesian Network • Download and install (works only on Windows) • Create nodes for random variables using the yellow ellipse icon • Link them using the arrow icon, dragging from one node to another with the mouse button pressed • To change the names of a variable's states, right-click the node and choose "Node Properties" -> Definition • To assign priors and CPT probabilities, right-click the node and choose "Node Properties" -> Definition; make sure they add up to 1 • Save it as a .xdsl file

  12. GeNIe: Querying • One can set the values of some random variables as evidence (right-click, "Set Evidence") • Click the yellow "lightning" icon to update the network • One can then see the probabilities of the other random variables by hovering the cursor over the small checkered squares on the nodes • See the sample network at http://www.uwm.edu/~katerj/courses/cibhi/BEAJM.xdsl (save it and open it in GeNIe), based on the burglary & earthquake example

  13. Sample Inferences from GeNIe • Diagnostic (evidential, abductive): From effect to cause. • P(Burglary | JohnCalls): 0.016 • P(Burglary | JohnCalls, MaryCalls): 0.284 • P(Alarm | JohnCalls, MaryCalls): 0.761 • P(Earthquake | JohnCalls, MaryCalls): 0.176 • Causal (predictive): From cause to effect • P(JohnCalls | Burglary) : 0.849 • P(MaryCalls | Burglary): 0.659 • Intercausal (explaining away): Between causes of a common effect. • P(Burglary | Alarm): 0.374 • P(Burglary | Alarm, Earthquake): 0.003 • Mixed: Two or more of the above combined • (diagnostic and causal) P(Alarm | JohnCalls, ¬Earthquake): 0.034 • (diagnostic and intercausal) P(Burglary | JohnCalls, ¬Earthquake): 0.016
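The diagnostic queries on this slide can be reproduced without GeNIe by brute-force enumeration over the joint distribution. The sketch below uses the standard Russell & Norvig parameters for the burglary/earthquake network, which is an assumption about what the sample .xdsl file contains (the slide's own numbers match those parameters).

```python
# Inference by enumeration on the burglary/earthquake network,
# using the textbook (Russell & Norvig) parameters.
from itertools import product

P_B, P_E = 0.001, 0.002                      # P(Burglary), P(Earthquake)
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}  # P(Alarm | B, E)
P_J = {True: 0.90, False: 0.05}              # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}              # P(MaryCalls | Alarm)

def pr(p, value):
    """Probability of a boolean outcome given P(True)."""
    return p if value else 1.0 - p

def joint(b, e, a, j, m):
    return (pr(P_B, b) * pr(P_E, e) * pr(P_A[(b, e)], a)
            * pr(P_J[a], j) * pr(P_M[a], m))

def query(target, **evidence):
    """P(target=True | evidence), summing the full joint (2^5 terms)."""
    names = ["b", "e", "a", "j", "m"]
    num = den = 0.0
    for vals in product([True, False], repeat=5):
        assign = dict(zip(names, vals))
        if any(assign[k] != v for k, v in evidence.items()):
            continue
        p = joint(**assign)
        den += p
        if assign[target]:
            num += p
    return num / den

print(round(query("b", j=True), 3))          # 0.016, as on the slide
print(round(query("b", j=True, m=True), 3))  # 0.284, as on the slide
```

Enumeration is exponential in the number of variables, so it only works for toy networks like this one; GeNIe uses more efficient exact algorithms internally.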

  14. Number of Entries in a CPT • For a node with n parents, one needs 2^n entries in its CPT (assuming the random variables take binary values) • Hence the number of entries grows exponentially with the number of incoming edges • Estimating all these parameters becomes a problem when a node has a large number of parents

  15. Noisy-Or • The noisy-or assumption on the conditional probabilities of a node reduces the number of CPT entries that must be specified to n • A way of combining probabilities from multiple sources • Assumption: any parent node being true can, on its own, lead to the child node being true (analogous to a logical OR)

  16. Noisy-Or Example • P(fever|cold) = 0.8 • P(fever|pneumonia) = 0.9 • P(fever|chicken-pox) = 0.7 • If you have a cold or pneumonia or chicken-pox, there is a good chance you have a fever • What is the probability of not having a fever if you have all of them? • You would have to escape getting a fever from the cold, and from pneumonia, and from chicken-pox

  17. Noisy-Or Example P(not fever|cold) = 1-0.8 P(not fever|pneumonia) = 1-0.9 P(not fever|chicken-pox) = 1-0.7 Assuming noisy-or: P(not fever|cold,pneumonia,chicken-pox) = (1-0.8)*(1-0.9)*(1-0.7) P(fever|cold,pneumonia,chicken-pox) = 1-(1-0.8)*(1-0.9)*(1-0.7) = 0.994
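The "escape" computation on this slide is a one-liner in code; this small sketch just restates the formula above.

```python
# Noisy-or combination: the child fails to fire only if it independently
# "escapes" activation from every true parent.

def noisy_or(probs):
    """P(child | its true parents), given each parent's individual
    activation probability P(child | that parent alone)."""
    escape = 1.0
    for p in probs:
        escape *= (1.0 - p)
    return 1.0 - escape

# P(fever | cold, pneumonia, chicken-pox) with the slide's numbers
p = noisy_or([0.8, 0.9, 0.7])
print(round(p, 3))  # 0.994
```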

  18. Noisy-Or in CPTs [network diagram] Cold, Pneumonia and Chicken-pox are parents of Fever

  19. Noisy-Or in CPTs [tables showing the specified parameters and the computed CPT entries] Requires specifying only 3 (n) parameters instead of 8 (2^n). If all the P(fever|X) probabilities were 1, it would be exactly like a logical OR; since they are not all 1, it is called noisy-or.
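The expansion from n specified parameters to the full 2^n computed CPT rows, as sketched on this slide, can be written out explicitly. This is an illustration of the idea, not GeNIe's internal code, and it omits the optional "leak" probability that some noisy-or implementations add for the all-parents-false case.

```python
# Expanding 3 noisy-or parameters into the full 2^3 = 8-row CPT for
# P(fever | cold, pneumonia, chicken-pox).
from itertools import product

causes = {"cold": 0.8, "pneumonia": 0.9, "chicken-pox": 0.7}

cpt = {}
for states in product([True, False], repeat=len(causes)):
    escape = 1.0  # probability of escaping fever from every present cause
    for (name, p), present in zip(causes.items(), states):
        if present:
            escape *= (1.0 - p)
    cpt[states] = 1.0 - escape  # P(fever | this parent configuration)

print(len(cpt))                           # 8 rows computed from only 3 parameters
print(round(cpt[(True, True, True)], 3))  # 0.994
print(cpt[(False, False, False)])         # 0.0 (no leak term in this sketch)
```

With 20 binary parents this gap becomes 20 parameters versus over a million CPT entries, which is the estimation problem slide 14 describes.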

  20. Other Mechanisms • If the random variables are not binary, an equivalent mechanism called noisy-max is used • In certain situations another type of assumption, called noisy-and, is used; it is analogous to a logical AND, i.e. the child node is true only if all the parent nodes are true
