
Introducing Bayesian Networks in Neuropsychological Measures


Presentation Transcript


  1. Introducing Bayesian Networks in Neuropsychological Measures Presented at the 2006 APS Annual Conference. Toshi Yumoto, University of Maryland / Abt Associates; Gregory Anderson, Xtria / Adler School of Professional Psychology; Daisy Wise, University of Maryland

  2. Bayesian Networks • Combining information from many different sources objectively is very difficult for practitioners (Paul Meehl, Clinical versus Statistical Prediction, 1954) • Bayesian Networks provide an effective way to combine different sources of information, such as: • Information from distinct domains • Demographic information (e.g., gender, race/ethnicity) and clinical information (e.g., previous diagnoses, different types of tests) • It is easy to add new information or to modify existing information

  3. Bayesian Networks (2) • A Bayes net is suited to discrete data • Continuous data need to be converted into categorical data • Complex models can be divided into several conditionally independent parts • Model estimation is easier • It is straightforward to add conditionally independent data • A Bayesian Network is visual and intuitive • Class assignments are expressed as proportions (e.g., percentages) • It provides a prediction for all parameters based on limited (or no) information • It allows both subjective and objective evaluation of a network • Microsoft Belief Networks (MSBNx; Kadie, Hovel, & Horvitz, 2001) was used to build the Bayesian Network
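To make the "class assignments as proportions" and "prediction with limited information" points concrete, here is a minimal plain-Python/NumPy sketch of the underlying computation for a single three-level latent class and one discretized indicator. All probability values are invented for illustration only; the presentation itself used MSBNx for this work.

```python
import numpy as np

# Minimal sketch of the core Bayes-net update, with made-up numbers.
# Prior proportion over a 3-level latent class (low, middle, high) -- illustrative.
prior = np.array([0.3, 0.5, 0.2])

# P(discretized test score | latent class): rows = class, columns = score category.
cpt = np.array([
    [0.7, 0.2, 0.1],   # low class
    [0.2, 0.6, 0.2],   # middle class
    [0.1, 0.2, 0.7],   # high class
])

observed_score = 2                      # person falls in the highest score category
likelihood = cpt[:, observed_score]     # P(observed score | each class)

posterior = prior * likelihood
posterior /= posterior.sum()            # renormalize so it is again a proportion

print("Prior class proportions:    ", prior)
print("Posterior class proportions:", posterior.round(3))
```

With no evidence entered, the reported proportions are simply the prior; each additional observed score multiplies in its likelihood column and the result is renormalized.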

  4. Steps in Network Development • Part I • Conversion of continuous scores to discrete categories using a finite mixture approach • Part II • Create a Bayesian Network based on the hypothesized model • Estimate the conditional probabilities • This research used a latent class model • Part III • Examine and modify the model

  5. Measures and Sample • The ATLAS is a comprehensive assessment for ADHD containing 7 different sections (Anderson & Post, 2006). One of the sections is a series of neuropsychological measures and observations of performance during testing. • This paper examines three of the major traits of the neuropsychological measures for ADHD. • Diagnoses of ADHD & LD were gathered from a parent report. • The sample is around 220 subjects, 8-18 years of age, from across the nation, gathered during the test's field trials.

  6. Creation of Discrete Scores • The assumption was made that test scores come from more than one distribution • A finite mixture model was therefore utilized • Cut scores were established at the intersections of the distributions • # of mixture distributions = # of cut scores + 1 • A discrete score was assigned based on the mixture distribution that best characterizes the person's score • For more information, contact the authors.
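As an illustration of this step, the sketch below fits a two-component Gaussian mixture to simulated "Trail A"-style times with scikit-learn and places the cut score where the two weighted component densities intersect, matching the rule above. The simulated data, the component count, and the use of scikit-learn are assumptions for illustration only; the original analysis used the ATLAS field-trial data and is not reproduced here.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulated completion times drawn from two overlapping groups (purely illustrative).
scores = np.concatenate([rng.normal(30, 6, 150), rng.normal(55, 10, 70)]).reshape(-1, 1)

# Fit a two-component finite (Gaussian) mixture.
gm = GaussianMixture(n_components=2, random_state=0).fit(scores)
weights = gm.weights_
means = gm.means_.ravel()
sds = np.sqrt(gm.covariances_.ravel())

# Cut score = point between the two means where the weighted component densities
# intersect (# of mixture distributions = # of cut scores + 1).
grid = np.linspace(means.min(), means.max(), 10_000)
dens = np.array([w * norm.pdf(grid, m, s) for w, m, s in zip(weights, means, sds)])
cut_score = grid[np.argmin(np.abs(dens[0] - dens[1]))]

# Assign each person a discrete category from the cut score.
discrete = (scores.ravel() > cut_score).astype(int)   # 0 = faster group, 1 = slower group
print(f"Estimated cut score: {cut_score:.1f}")
```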

  7. Finite Mixture Analysis for Trail A Cut Score

  8. Hypothesized Model

  9. Model Specification • Three Distinct Domains • Visual Trace/Sequence • Three Latent Classes • Memory • Three Latent Classes • Impulsive/Error • Two Latent Classes • Second Order Latent Class • Four Latent Classes • Diagnosis (LD/ADHD) • Four manifest categories • Typical, LD, ADHD, and LD/ADHD

  10. Specification of the Bayesian Network • For each domain, the conditional probabilities of the indicator variables are specified given latent class membership (for that domain). • These probabilities are first specified under the assumption that we have no information. • They are then updated, given information such as test scores or clinical observations.
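One possible encoding of a small slice of such a network (the Visual Trace domain with two indicators) is sketched below using pgmpy, a Python Bayesian-network library, as a stand-in for MSBNx; the node names, category counts, and every probability value are invented for illustration, and the pgmpy 0.x API is assumed. Querying the latent class with no evidence gives the "no information known" prediction; entering item scores as evidence gives the updated proportions.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Partial model: VisualTrace latent class -> two discretized Trail A indicators.
model = BayesianNetwork([("VisualTrace", "TrailA_time"), ("VisualTrace", "TrailA_error")])

# Prior over the three Visual Trace latent classes (low, middle, high) -- illustrative.
cpd_vt = TabularCPD("VisualTrace", 3, [[0.3], [0.5], [0.2]])

# P(indicator category | latent class); columns correspond to the three classes.
cpd_time = TabularCPD(
    "TrailA_time", 3,
    [[0.7, 0.2, 0.1],    # fast
     [0.2, 0.6, 0.2],    # average
     [0.1, 0.2, 0.7]],   # slow
    evidence=["VisualTrace"], evidence_card=[3],
)
cpd_err = TabularCPD(
    "TrailA_error", 2,
    [[0.8, 0.5, 0.3],    # few errors
     [0.2, 0.5, 0.7]],   # many errors
    evidence=["VisualTrace"], evidence_card=[3],
)

model.add_cpds(cpd_vt, cpd_time, cpd_err)
assert model.check_model()

infer = VariableElimination(model)

# "No information known": the prediction is just the prior class proportions.
print(infer.query(["VisualTrace"]))

# "Item scores known": entering evidence updates the class proportions.
print(infer.query(["VisualTrace"], evidence={"TrailA_time": 2, "TrailA_error": 1}))
```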

  11. Partial Model: Visual Trace Network (no information known)

  12. Partial Model: Visual Trace Network (all item scores known)

  13. Probability of the LD/ADHD state given: • middle Visual Trace level • middle Memory level • high Impulsive level • The SOClass and LD/ADHD states are unobserved • The proportions shown are the expected class states

  14. Probability of the LD/ADHD state with six test results • Trail A time (middle) and error (middle) • Trail B time (middle-low) and error (middle) • Word Memory 1 (low) and Word Memory 3 (low) • The other nodes show their expected category distributions • The highest probability indicates the most likely category
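The sketch below illustrates the arithmetic behind this slide with invented numbers: each entered test result contributes a likelihood column that is multiplied into the prior over the four diagnostic categories and renormalized, and the highest resulting probability identifies the most likely category. For brevity it treats the six results as conditionally independent given the diagnosis directly (a naive-Bayes-style flattening), whereas the actual network routes them through the domain latent classes and the second-order class.

```python
import numpy as np

categories = ["Typical", "LD", "ADHD", "LD/ADHD"]
posterior = np.array([0.55, 0.15, 0.20, 0.10])          # prior proportions (illustrative)

# P(observed result | category) for each of the six entered test results -- all invented.
test_likelihoods = {
    "TrailA_time(middle)":  np.array([0.40, 0.30, 0.25, 0.20]),
    "TrailA_error(middle)": np.array([0.35, 0.30, 0.30, 0.25]),
    "TrailB_time(mid-low)": np.array([0.30, 0.35, 0.30, 0.35]),
    "TrailB_error(middle)": np.array([0.35, 0.30, 0.30, 0.30]),
    "WordMemory1(low)":     np.array([0.15, 0.40, 0.25, 0.45]),
    "WordMemory3(low)":     np.array([0.15, 0.45, 0.25, 0.50]),
}

for name, lik in test_likelihoods.items():
    posterior = posterior * lik          # conditional independence given the category
    posterior = posterior / posterior.sum()

print(dict(zip(categories, posterior.round(3))))
print("Most likely category:", categories[int(np.argmax(posterior))])
```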

  15. It is easy to combine additional information, such as clinical observations and gender, to improve model prediction. • Clinical observation is conditionally independent of the other nodes given the LD/ADHD state (i.e., it only directly affects the probability of the LD/ADHD node) • Gender has a direct effect on the LD/ADHD state and the Impulsive/Error level, which indirectly affects the second-order class states. • This type of information is harder to add later and should be included from the beginning, if appropriate.

  16. Summary • Bayesian Networks provide an effective way to express and examine hypothesized models. • Model performance can be compared in terms of prediction accuracy (e.g., the LD/ADHD diagnoses in this research). • Any statistical procedure that estimates expected scores (i.e., response probabilities) may be used to build a network. • A Bayes net uses discrete data; therefore, latent class models and latent trait models (with discrete proficiency levels) fit model development nicely. • Combining additional information is straightforward and relatively easy • Understanding conditional independence is the key • A Bayes net estimates expected probabilities from the available information • It makes the best possible diagnosis without complete data

  17. Contacts • Toshi Yumoto • fyumoto@umd.edu • Gregory Anderson • ganderson@xtria.com • Daisy Wise • dawise@umd.edu
