
Knowledge Based 3D Medical Image Segmentation


Presentation Transcript


  1. Knowledge Based 3D Medical Image Segmentation Tina Kapur MIT Artificial Intelligence Laboratory http://www.ai.mit.edu/~tkapur tkapur@ai.mit.edu

  2. Outline • Goal of Segmentation • Applications • Why is segmentation difficult? • My method for segmentation of MRI • Future Work

  3. The Goal of Segmentation

  5. Applications of Segmentation • Image Guided Surgery

  7. Applications of Segmentation • Image Guided Surgery • Surgical Simulation

  9. Applications of Segmentation • Image Guided Surgery • Surgical Simulation • Neuroscience Studies • Therapy Evaluation

  10. Limitations of Manual Segmentation • slow (up to 60 hours per scan) • variable (up to 15% between experts) [Warfield 95, Kaus 98]

  11. The Automatic Segmentation Challenge An automated segmentation method needs to reconcile • Gray-level appearance of tissue • Characteristics of imaging modality • Geometry of anatomy

  12. How to Segment? i.e. Issues in Segmentation of Anatomy

  13. How to Segment? i.e. Issues in Segmentation of Anatomy • Tissue Intensity Models

  14. How to Segment? i.e. Issues in Segmentation of Anatomy • Tissue Intensity Models • Parametric [Vannier] • Non-Parametric [Gerig] • Point Distribution Models [Cootes] • Texture [Mumford]

  15. How to Segment? i.e. Issues in Segmentation of Anatomy • Tissue Intensity Models • Imaging Modality Models

  16. How to Segment? i.e. Issues in Segmentation of Anatomy • Tissue Intensity Models • Imaging Modality Models • MRI inhomogeneity [Wells]

  17. How to Segment? i.e. Issues in Segmentation of Anatomy • Tissue Intensity Models • Imaging Modality Models • Anatomy Models: Shape, Geometric/Spatial

  18. How to Segment? i.e. Issues in Segmentation of Anatomy • Tissue Intensity Models • Imaging Modality Models • Anatomy Models: Shape, Geometric/Spatial • PCA [Cootes and Taylor, Gerig, Duncan, Martin] • Landmark Based [Evans] • Atlas [Warfield]

  19. Typical Pipeline for Segmentation of Brain MRI: pre-processing (noise removal)

  20. Typical Pipeline for Segmentation of Brain MRI: pre-processing (noise removal) → intensity-based classification

  21. Typical Pipeline for Segmentation of Brain MRI: pre-processing (noise removal) → intensity-based classification → post-processing (morphology/other)

  22. Contributions of Thesis • Developed an integrated Bayesian segmentation method for MRI that incorporates de-noising and global geometric knowledge as priors in EM-Segmentation • Applied the integrated Bayesian method to segmentation of brain and knee MRI

  23. Contributions of Thesis • The Priors • de-noising: novel use of a Mean-Field Approximation to a Gibbs random field in conjunction with EM-Segmentation (EM-MF) • geometric: novel statistical description of global spatial relationships between structures, used as a spatially varying prior in EM-Segmentation

  24. Background to My Work • Expectation-Maximization Algorithm • EM-Segmentation

  25. Expectation-Maximization • Relevant Literature: • [Dempster, Laird, Rubin 1977] • [Neal 1998]

  26. Expectation-Maximization (what?) • Search Algorithm • for Parameters of a Model • to Maximize Likelihood of Data • Data: some observed, some unobserved

  27. Expectation-Maximization (how?) • Initial Guess of Model Parameters • Re-estimate Model Parameters: • E Step: compute the PDF for hidden variables, given observations and current model parameters • M Step: compute ML model parameters assuming the PDF for hidden variables is correct
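The two-step loop on this slide can be sketched on a toy problem. Below is a minimal, illustrative EM fit of a two-component 1D Gaussian mixture with fixed, equal variances; `em_gmm_1d` and its initialization are assumptions made for this sketch, not the thesis code.

```python
import math

def em_gmm_1d(data, n_iter=50):
    """Toy EM for a two-component 1D Gaussian mixture (fixed unit variance).
    Illustrates the E/M alternation described on the slide."""
    # Initial guess of model parameters (step 1 on the slide).
    mu = [min(data), max(data)]
    pi = [0.5, 0.5]
    sigma2 = 1.0
    for _ in range(n_iter):
        # E step: posterior responsibility of each component for each point,
        # given the current parameters.
        resp = []
        for x in data:
            p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * sigma2)) for k in (0, 1)]
            z = p[0] + p[1]
            resp.append([p[0] / z, p[1] / z])
        # M step: maximum-likelihood parameters, treating the posterior
        # responsibilities as if they were correct.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            pi[k] = nk / len(data)
    return mu, pi
```

On data drawn near 0 and near 5, the means converge to roughly those two centers.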

  28. Expectation-Maximization (how exactly?) • Notation • Observed Variables: • Hidden Variables: • Model Parameters:

  29. Expectation-Maximization (how exactly?) • Initial Guess: • Successive Estimation of • E Step: • M Step:
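The formulas on this slide did not survive the transcript. In the standard notation of [Dempster, Laird, Rubin 1977] — observed data $X$, hidden variables $Z$, model parameters $\theta$, matching the three items on slide 28 — the iteration is conventionally written as:

```latex
\theta^{(0)} \leftarrow \text{initial guess}
\qquad\text{then, for } t = 0, 1, 2, \ldots
\\[4pt]
\text{E step:}\quad
Q\!\left(\theta \mid \theta^{(t)}\right)
  = \mathbb{E}_{Z \mid X,\, \theta^{(t)}}\!\left[\log p(X, Z \mid \theta)\right]
\\[4pt]
\text{M step:}\quad
\theta^{(t+1)} = \arg\max_{\theta}\; Q\!\left(\theta \mid \theta^{(t)}\right)
```

This is the generic form; the thesis slide may have used different symbols.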

  30. Expectation-Maximization • Summary/Intuition: • If we had complete data, maximize likelihood • Since some data is missing, approximate likelihood with its expectation • Converges to local maximum of likelihood

  31. EM-Segmentation [Wells 1994] • Observed Signal is modeled as a product of the true signal and a corrupting gain field due to the imaging equipment • Expectation-Maximization is used on log-transformed observations for iterative estimation of • tissue classification • corrupting bias field (inhomogeneity correction)
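Written out (the slide's own equations were lost in transcription), the standard form of this measurement model, with $Y_s$ the observed intensity at voxel $s$, $X_s$ the true signal, and $G_s$ the gain field, is:

```latex
Y_s = G_s\, X_s
\quad\xrightarrow{\;\log\;}\quad
\log Y_s = \log X_s + \beta_s,
\qquad \beta_s = \log G_s
```

so after the log transform the multiplicative gain becomes an additive, slowly varying bias field $\beta$ that EM can estimate alongside the tissue classification.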

  32. EM-Segmentation [Wells 1994] • E-Step and M-Step are iterated in alternation

  33. EM-Segmentation [Wells 1994] • E-Step: compute tissue posteriors using current intensity correction. • M-Step: estimate intensity correction using residuals based on current posteriors.
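This alternation can be sketched on a toy 1D "image" in the log domain. The sketch below is a deliberately simplified stand-in for the Wells method: class means are assumed known, and the bias estimate is left unsmoothed, whereas the real method constrains the bias field to be slowly varying (e.g. by low-pass filtering the residuals).

```python
import math

def em_segment(log_intensity, class_means, sigma2=0.25, n_iter=20):
    """Toy sketch of the Wells-style E/M alternation between tissue
    posteriors and an additive bias estimate in the log domain.
    Simplification: the smoothing that keeps the bias slowly varying
    is omitted here."""
    bias = [0.0] * len(log_intensity)
    for _ in range(n_iter):
        posteriors = []
        # E-step: tissue posteriors given the current intensity correction.
        for y, b in zip(log_intensity, bias):
            p = [math.exp(-(y - b - m) ** 2 / (2 * sigma2)) for m in class_means]
            z = sum(p)
            posteriors.append([pk / z for pk in p])
        # M-step: bias from the residuals weighted by current posteriors.
        for s, (y, post) in enumerate(zip(log_intensity, posteriors)):
            predicted = sum(pk * m for pk, m in zip(post, class_means))
            bias[s] = y - predicted
    # Hard classification: most probable class per voxel.
    labels = [max(range(len(class_means)), key=lambda k: post[k])
              for post in posteriors]
    return labels, bias
```

With two class means at 0.0 and 2.0, voxels near those log intensities are assigned to the corresponding classes.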

  34. EM-Segmentation [Wells 1994] • Observed Variables • log-transformed intensities in image • Hidden Variables • indicator variables for classification • Model Parameters • the slowly varying corrupting bias field (subscript s refers to variables at voxel s in the image)

  35. EM-Segmentation [Wells 1994] • Initial Guess: • Successive Estimation of • E Step: • M Step:

  37. Situating My Work • Prior in EM-Segmentation: • Independent and Spatially Stationary • My contribution is the addition of two priors: • a spatially stationary Gibbs prior to model local interactions between neighbors (thermal noise) • a spatially varying prior to model global relationships between the geometry of structures

  38. The Gibbs Prior • Gibbs Random Field (GRF) • a natural way to model piecewise homogeneous phenomena • used in image restoration [Geman & Geman 84] • Probability model on a lattice • Partially relaxes the independence assumption to allow interactions between neighbors
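For reference (the slide's own formula did not survive), the general Gibbs form, as in [Geman & Geman 84], for a label field $W$ on a lattice is:

```latex
P(W) = \frac{1}{Z}\exp\{-U(W)\},
\qquad
U(W) = \sum_{c \in \mathcal{C}} V_c(W),
\qquad
Z = \sum_{W'} \exp\{-U(W')\}
```

where $\mathcal{C}$ is the set of cliques of the neighborhood system, $V_c$ are clique potentials, and $Z$ is the (intractable) partition function.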

  39. EM-MF Segmentation: EM + Gibbs Prior • We model tissue classification W as a Gibbs random field:

  40. EM-MF Segmentation: Gibbs Prior on Classification • We model tissue classification W as a Gibbs random field:

  41. EM-MF Segmentation: Gibbs Prior on Classification • To fully specify the Gibbs model: • define the neighborhood system as a first-order neighborhood system, i.e. the 6 closest voxels • use it to define
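Under a first-order (6-neighbor) system, one common choice of energy is a Potts-style penalty on disagreeing face-adjacent voxels. The function below computes such an energy for a flattened 3D label array; the Potts potential and `beta` are illustrative assumptions, not necessarily the thesis's clique potentials.

```python
def gibbs_energy(labels, shape, beta=1.0):
    """Potts-style energy under a first-order (6-neighbor) system:
    each pair of disagreeing face-adjacent voxels costs beta.
    `labels` is a flat list in (x, y, z) row-major order."""
    nx, ny, nz = shape

    def at(i, j, k):
        # Row-major index into the flattened 3D volume.
        return labels[(i * ny + j) * nz + k]

    energy = 0.0
    for i in range(nx):
        for j in range(ny):
            for k in range(nz):
                # Count each 6-neighborhood pair exactly once by only
                # looking in the +x, +y, +z directions.
                if i + 1 < nx and at(i, j, k) != at(i + 1, j, k):
                    energy += beta
                if j + 1 < ny and at(i, j, k) != at(i, j + 1, k):
                    energy += beta
                if k + 1 < nz and at(i, j, k) != at(i, j, k + 1):
                    energy += beta
    return energy
```

Lower energy corresponds to more piecewise-homogeneous labelings, which is exactly the behavior the Gibbs prior is meant to encourage.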

  42. EM-MF Segmentation: Gibbs form of Posterior • Gibbs prior and Gaussian Measurement Models lead to Gibbs form for Posterior:

  44. EM-MF Segmentation • For E-Step: Need values for

  45. EM-MF Segmentation • For E-Step: Need values for • Cannot compute directly from Gibbs form

  46. EM-MF Segmentation • For E-Step: Need values for • Cannot compute directly from Gibbs form • Note

  47. EM-MF Segmentation • For E-Step: Need values for • Cannot compute directly from Gibbs form • Note • Can approximate • Mean-Field Approximation to GRF

  48. Mean-Field Approximation • Deterministic approximation to GRF [Parisi 84] • the mean/expected value of a GRF is obtained as a solution to a set of consistency equations • Update equation is obtained using the derivative of the partition function with respect to the external field g [Elfadel 93] • Used in image reconstruction [Geiger, Yuille, Girosi 91]
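As a concrete instance of such consistency equations (a standard textbook form for an Ising-type field, not the thesis's exact derivation), the mean-field equations with coupling $\beta$ and external field $h_s$ are:

```latex
m_s = \tanh\!\Big(\beta \sum_{t \in N(s)} m_t + h_s\Big)
```

solved by fixed-point iteration: $m_s$ approximates the expected label at site $s$, given the mean values of its neighbors rather than their full joint distribution.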

  49. Mean-Field Approximation to Posterior GRF • Intuition: • the denominator is a normalizer • the numerator captures: • effect of labels at neighbors • measurement at the voxel itself

  50. Summary of EM-MF Segmentation • Modeled piecewise homogeneity of tissue using a Gibbs prior on classification • This leads to a Gibbs form for the posteriors • Posterior probabilities in the E-Step are approximated via a Mean-Field solution
