
Evaluating health informatics projects

This presentation discusses the reasons for evaluating health informatics projects, the problems faced during evaluation, and the objectivist and subjectivist models of evaluation. It explores the perspectives of stakeholders and the effects of evaluation on the structure, processes, and outcomes of health care. The complexity of evaluating the combination of medicine, health care, and information systems is highlighted, along with the challenges of evaluating the impact of information systems on patient care. Various evaluation methodologies and study features are also discussed, including measurement studies, demonstration studies, descriptive studies, comparative studies, and correlational studies.

Presentation Transcript


  1. Evaluating health informatics projects
  • Reasons for and problems of evaluation
  • Objective model
  • Subjective model

  2. Definitions
  • Evaluate: to determine the value of (Chambers)
  • To examine and judge carefully (Dictionary.com)

  3. Reasons for evaluation (Friedman & Wyatt)
  • Promotional: encouraging people to use systems
  • Scholarly: studying the impact etc. of health informatics systems
  • Pragmatic (practical): finding out what is good and bad, to improve future systems
  • Ethical: like any medical intervention, a system must be safe and effective
  • Legal: for the same reason, and to inform users so they know when and when not to use it

  4. Perspectives
  • Stakeholders include:
    • Developers
    • Users
    • Patients
    • Managers
    • Sources of funding

  5. Effects
  • Structure: environment, staff, money
  • Processes: diagnosis, investigation, treatments
  • Outcomes: success of treatment, survival, continuing health

  6. Complexity
  • A combination of:
    • Medicine & health care
    • Information systems / IT
    • Evaluation methodology
  • Each of these is a huge area
  • Arguably IT is the simplest, or at least the most structured

  7. Of Medicine
  • Extremely large and growing area of knowledge
  • Complex structure: equipment, staff, regulation
  • Processes: treatments etc.
  • Outcomes: long term, difficult to measure
  • Knock-on effects of innovation
  • The effect of IT is particularly hard to measure

  8. Of Information systems
  • Difficult to test fully: combinatorial explosion
  • Multi-function
  • Has a range of effects: the system itself vs. its impact on health care

  9. Of Evaluation
  • Have to measure impact, which means impact on people: difficult to study
  • Need patients and staff to perform tests; there may not be enough willing to cooperate
  • A range of things can be evaluated, from 'Does it work?' to 'Does it help patients?'

  10. Evaluation
  • In theory: study the situation before & after
  • In practice:
    • we don't know what changes would have occurred without the innovation
    • we don't know what interesting questions will arise during the study

  11. Tips
  • Tailor the study to the problem: this is not research, but specific to this project
  • Collect useful data: data which inform the final decision
  • Look for side-effects: effects not related to the intended purpose
  • Formative & summative: study during & after development

  12. Tips (continued)
  • In vitro vs. in vivo: evaluate both off-site and on-site
  • Don't accept the developer's view
  • Take account of the environment (context)
  • Let questions appear during the study
  • Be prepared to use a range of methods

  13. What can be studied
  • Need for the resource: what does it give us that we didn't have before?
  • Development process: what methods do the developers use to design their solution?
  • Structure of the resource: what do the program & specification look like?
  • Functions of the resource: how well does it work?
  • Impact: how does it affect health care professionals (HCPs) and patients?

  14. Study features
  • Focus: as on the previous slide
  • Setting: laboratory or hospital
  • Data: real or simulated
  • Users: developers, evaluators or end-users
  • Decisions: none, simulated or real
  • Together these features describe a study design, as sketched below
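
As an illustration only (the class and the example values below are hypothetical, not part of the original slides), these five features can be captured in a small Python data structure, which shows what one concrete study design looks like in this vocabulary:

```python
# A minimal sketch of slide 14's study features; all names and example
# values are illustrative, not taken from the slides.
from dataclasses import dataclass

@dataclass
class StudyDesign:
    focus: str      # e.g. "need", "development", "structure", "function", "impact"
    setting: str    # "laboratory" or "hospital"
    data: str       # "real" or "simulated"
    users: str      # "developers", "evaluators" or "end-users"
    decisions: str  # "none", "simulated" or "real"

# A laboratory function study, described in this vocabulary:
lab_function_study = StudyDesign(
    focus="function",
    setting="laboratory",
    data="simulated",
    users="evaluators",
    decisions="none",
)
```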

  15. Types of study
  • Need validation
  • Design validation
  • Structure validation
  • Laboratory function
  • Field function
  • Laboratory user impact
  • Field user impact
  • Clinical impact

  16. Objectivist or quantitative approach
  • Things can be measured objectively, without affecting the thing being measured
  • What to measure can be agreed rationally
  • Numerical data can be used
  • Definite conclusions can be drawn

  17. Objectivist approaches
  • Comparison-based: like a randomised clinical trial
  • Objectives-based: does it do what the designers said it would?
  • Decision facilitation: answers questions posed by managers
  • Goal-free approach: evaluators are not aware of the project goals

  18. Methods
  • Measurement studies
  • Demonstration studies:
    • Descriptive
    • Comparative
    • Correlational
  • Statistical analysis

  19. Measurement studies
  • Terminology for measurements:
    • Object, e.g. a patient
    • Object class, e.g. a patient group
    • Attribute, e.g. temperature
    • Instrument, e.g. a thermometer
    • Observation, e.g. the temperature at one point in time
    • Validation, e.g. calibration of the thermometer
  • This terminology maps onto a simple data structure, as sketched below
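
As a sketch of slide 19's terminology (the class names and sample values are invented for illustration, not drawn from the slides), the measurement vocabulary might be modelled like this:

```python
# Slide 19's measurement terminology as illustrative Python classes.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Observation:
    attribute: str      # what is measured, e.g. "temperature"
    value: float        # the reading itself
    instrument: str     # what produced the reading, e.g. "thermometer"
    taken_at: datetime  # an observation is tied to one point in time

@dataclass
class Patient:  # the "object" being measured
    patient_id: str
    observations: list[Observation] = field(default_factory=list)

# An "object class" is then simply a group of objects:
patient_group = [
    Patient("p1", [Observation("temperature", 37.2, "thermometer",
                               datetime(2024, 1, 5, 9, 0))]),
    Patient("p2", [Observation("temperature", 38.1, "thermometer",
                               datetime(2024, 1, 5, 9, 5))]),
]
```

Validation (calibrating the thermometer) is then a check on the instrument itself, separate from any individual observation.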

  20. Demonstration studies
  • Demonstrate an effect, e.g. 'Do patients who have been inoculated have a higher temperature?'
  • In a demonstration study, the object becomes a subject (the patient) and the attribute becomes a variable (the temperature)

  21. Descriptive
  • 'The patients in this study have a rather high temperature.'
  • Summarised with the mean, standard deviation etc.
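
To make slide 21 concrete, a descriptive summary of temperature readings can be computed with Python's standard statistics module; the sample values here are made up for illustration:

```python
import statistics

# Hypothetical temperature readings (degrees C) for the patients in a study.
temperatures = [37.9, 38.4, 37.6, 38.8, 38.1, 37.7]

mean = statistics.mean(temperatures)
sd = statistics.stdev(temperatures)  # sample standard deviation

print(f"mean = {mean:.2f} C, standard deviation = {sd:.2f} C")
```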

  22. Comparative
  • 'The patients in this study have a higher temperature than a control group.'
  • Usually carried out in a controlled environment
  • Analysed with a t-test etc.
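
A comparative study like slide 22's is typically analysed with a two-sample t-test. A minimal sketch, assuming SciPy is available and using invented data:

```python
from scipy import stats

# Hypothetical temperatures (degrees C): study group vs. control group.
study_group = [38.2, 38.6, 37.9, 38.4, 38.1]
control_group = [37.1, 37.4, 36.9, 37.2, 37.0]

# Independent two-sample t-test: is the difference in means significant?
result = stats.ttest_ind(study_group, control_group)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

A small p-value would support the claim that the study group's temperatures really are higher than the control group's.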

  23. Correlational
  • 'We are seeing more patients with fever since we introduced inoculation.'
  • A live situation
  • Could still use a t-test
  • Tries to associate one factor with another in a real situation
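
For a correlational study such as slide 23's, one common choice is a correlation coefficient between the two factors. A minimal sketch, again assuming SciPy and using invented monthly figures:

```python
from scipy import stats

# Hypothetical monthly figures: inoculations given vs. fever cases seen.
inoculations = [120, 150, 170, 200, 230, 260]
fever_cases = [14, 18, 17, 22, 25, 27]

# Pearson correlation: how strongly are the two factors associated?
r, p = stats.pearsonr(inoculations, fever_cases)
print(f"r = {r:.2f}, p = {p:.4f}")
```

Note that correlation in a live situation does not establish causation; it only associates one factor with another.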

  24. Subjectivist or qualitative approach
  • Observations depend on the observer
  • Observations are only meaningful in context
  • Different points of view may be valid
  • Descriptions are as valuable as numbers
  • Discussion of results

  25. Subjectivist approaches
  • Quasi-legal: cf. an ethical debate
  • Art criticism: expert review
  • Professional review: a site visit
  • Responsive/illuminative: immersion in the environment; questions evolve over time

  26. Qualitative approach
  • Attempts to understand why, as well as to measure differences, e.g.:
    • Is the system working as intended?
    • How can it be improved?
    • Does it make a difference?
    • Are the differences beneficial?
    • Are the effects those expected?

  27. Stages in a qualitative study
  • Negotiation of ground rules
  • Immersion in the environment
  • Initial data collection to focus the questions
  • Iteration
  • Report and feedback
  • Final report

  28. Methods in a qualitative study
  • Observation
  • Interviews
  • Document analysis
  • Others, e.g. structured questionnaires

  29. Mixed study
  • Qualitative and quantitative approaches can be combined
