
Multiple Attribute Evaluation of Automatic Co-registration Software

Daniel Liu, Leonid Churilov, Soren Christensen, Stephen Davis, Geoffrey Donnan. ISC 2009.


Presentation Transcript


  1. Multiple Attribute Evaluation of Automatic Co-registration Software. Daniel Liu, Leonid Churilov, Soren Christensen, Stephen Davis, Geoffrey Donnan. ISC 2009

  2. Project Objective To define and compare the accuracy, usability, speed, and affordability of software available for automated DWI/PWI mismatch analysis, with the aim of using it for the purposes of the EXTEND trial

  3. RAPID: developed at Stanford University, USA; Linux-based software • nordicICE Penguin Stroke Perfusion Module: NordicNeuroLab, Norway; developed at the Centre of Functionally Integrated Neuroscience (CFIN), Aarhus University, Denmark • Perfusion Mismatch Analyzer: Acute Stroke Imaging Standardization Group, Japan • Neuroscape/Perfscape: Olea Medical, France

  4. Methodology – Multiple Criteria Decision Making • A structured, scientific approach to handling subjective judgement • Analysis of problems with multiple criteria requires the following steps (Belton & Stewart, 2002; Olson, 1996): • identifying objectives • arranging these objectives into a hierarchy and quantifying their relative importance • measuring how well available alternatives perform on each criterion • aggregating scores into a single measure using one of the multiattribute rating techniques • Objective identification and weighting were performed at problem-structuring workshops with experts, including neurologists and medical physicists
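The scoring-and-aggregation steps above can be sketched as a simple weighted sum, the core of SMART-style rating. All criterion names, weights, and scores below are hypothetical placeholders, not values from the study:

```python
# Minimal sketch of multi-attribute aggregation: each software package
# gets a score on every criterion, and a weighted sum collapses those
# scores into a single value used for ranking.

def aggregate(scores, weights):
    """Weighted-sum aggregation across criteria (SMART-style)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical criterion weights, as if elicited at an expert workshop
weights = {"accuracy": 0.4, "usability": 0.3, "speed": 0.2, "affordability": 0.1}

# Hypothetical per-criterion scores (0-100) for two candidate packages
candidates = {
    "Package A": {"accuracy": 80, "usability": 60, "speed": 90, "affordability": 50},
    "Package B": {"accuracy": 70, "usability": 85, "speed": 60, "affordability": 90},
}

ranking = sorted(candidates, key=lambda p: aggregate(candidates[p], weights),
                 reverse=True)
print(ranking)  # → ['Package B', 'Package A']
```

With these illustrative numbers, Package B wins (74.5 vs 73.0) despite lower accuracy, showing how the weighting, not any single criterion, drives the final ranking.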

  5. Criteria Tree

  6. Aggregation and Sensitivity Analysis • After measuring the available alternatives on individual criteria, the resulting values are aggregated across criteria through a weighting procedure • Weights reflect the perceived importance of individual criteria and are elicited during the problem-structuring workshops with experts, as indicated earlier • Several mathematical aggregation procedures are used for cross-validation purposes: • SMART: simple multiattribute rating technique • SMARTS: simple multiattribute rating technique with swing weighting • SMARTER: simple multiattribute rating technique exploiting ranks • AHP: analytic hierarchy process • Sensitivity analysis with respect to the weights is performed to estimate the robustness of the preferred software package
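Two of the techniques named above can be illustrated concretely: SMARTER derives weights from a mere ranking of the criteria via the rank-order-centroid (ROC) formula, and a one-at-a-time sensitivity sweep varies a single weight to see whether the preferred package changes. The criteria and scores below are made up for illustration:

```python
# Sketch of SMARTER's rank-order-centroid weights plus a simple
# sensitivity sweep over the most important criterion's weight.

def roc_weights(n):
    """ROC weights for n criteria ranked from most to least important:
    w_k = (1/n) * sum_{i=k}^{n} 1/i."""
    return [sum(1.0 / i for i in range(k, n + 1)) / n for k in range(1, n + 1)]

def aggregate(scores, weights):
    return sum(w * s for w, s in zip(weights, scores))

# Criteria ranked: accuracy > usability > speed > affordability
w = roc_weights(4)  # approx [0.521, 0.271, 0.146, 0.063]

# Hypothetical scores, listed in the same criterion order as the ranking
packages = {"Package A": [80, 60, 90, 50], "Package B": [70, 85, 60, 90]}

def best_package(w_accuracy):
    """Fix the accuracy weight, rescale the rest proportionally, re-rank."""
    rest = [x / sum(w[1:]) * (1 - w_accuracy) for x in w[1:]]
    weights = [w_accuracy] + rest
    return max(packages, key=lambda p: aggregate(packages[p], weights))

winners = {round(wa, 1): best_package(wa) for wa in [0.2, 0.4, 0.6, 0.8]}
print(winners)  # shows where (if anywhere) the preferred package flips
```

With these made-up scores the winner flips from Package B to Package A as the accuracy weight grows past roughly 0.5, which is exactly the kind of instability the sensitivity analysis on the slide is designed to expose.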
