
Design and Evaluation of Health Care Information Systems


Presentation Transcript


  1. Design and Evaluation of Health Care Information Systems: Focus on user involvement. TDT4210 Helseinformatikk, 2 Nov. 2005, Gry Seland

  2. Outline • 1. Importance of user involvement (Moen, 2003) • 2. Methods for user involvement (Sørby, Melby & Seland, 2005) • 3. Evaluation (Ammenwerth et al., 2004) • Standard for human-centred design: ISO 13407 (1999) • Full references are given on slide 28

  3. Why involve users? • Complex clinical work situation • Complex system • System must support clinical practice

  4. Challenges related to organizational factors • User behaviour, education and training • Legal and social issues • Cost-benefit of system? • Leadership • Enable development of new practice

  5. Anne Moen: Nursing perspective on EPR design and implementation • Technology, organization, informatics, nursing practice and nursing leadership

  6. Important questions • How can EPR support nurses/clinicians? • How can EPR lead to improved care quality and patient safety? • How can EPR contribute to building clinical knowledge? • Can EPR support collaboration between different health care professional groups? • Can EPR support administration and research?

  7. Examination of nursing/clinical practice • Characteristics of nurses’ work • Characteristics of the context of nurses’ work • ISO 13407: Human-centred design processes for interactive systems

  9. Nursing/clinical leadership • Responsible for developing nursing practice • Early and ongoing involvement in the EPR project is important • Allocating time signals the importance of the EPR

  10. Methods to understand and involve users of EPR systems • Interviews and discussions • Observation studies • Role play • Usability testing

  11. Interviews and group discussions: “What do you need?” • From formal to informal discussions • Useful information about practice and needs • Memory constraints • Demand characteristics: people tell you what they think is expected • Users don’t know about possible technological solutions • Not everything is easy to articulate (tacit knowledge) • What about mobility?

  12. Observation studies • Video recording, being present in the same room, shadowing • Observe things people don’t talk about • Questions must wait; the observer cannot interrupt • Difficult to transform observations into design suggestions

  13. Role play • Sometimes it is easier to show than to explain • A role play can be “frozen” and replayed • Design of mobile technology needs methods that make the mobility of the situation visible

  14. Role play and prototyping • Technology changes practice, and vice versa • [Figure: design methods positioned along two axes, Practice (present vs. future) and Technology (present vs. future): task analysis, use cases (RUP), interviews, observation studies, future workshops, BPR, and role play with low-fi prototyping]

  15. Role play workshop • Part one: Focus on today’s situation • What is today’s situation? • What is considered problematic today? • Part two: Tomorrow’s situation • How can technology be helpful in the future? • What can technology not do?

  16. Design in action • Act out a scenario until someone identifies an information need • ”Freeze the scenario” • Choose a prototyping model. Sketch the functionality on paper. • Continue acting

  18. Usability testing • Early (with paper prototypes) and as a last step before implementation • Identifying ”breakdowns” • The usability laboratory at NSEP is available to master’s students

  19. Evaluation (Ammenwerth et al.) • Measuring or exploring properties of a health information system to inform decisions to be made concerning that system in a specific context • Summative (after implementation) • Formative (during system development)

  20. Why evaluate • Assess quality, value, effect and impact of IT in health care • Improve health information applications • Enable an evidence-based health informatics profession and practice • Development and implementation is expensive, so learn from experience

  21. Evaluation questions • Usability (specific users and specific context) • User attitudes • Cost-effectiveness • Organizational and social consequences

  22. Barriers to evaluation • Evaluation methods not adapted to the health care context • Lack of support for formative evaluation • Evaluation studies do not always answer important questions • Limited value of evaluation reports to others • Evaluation is transdisciplinary, and all the academic fields involved (medical informatics, psychology, health economics etc.) have their own methodologies

  23. HIS-EVAL 2003 • Bring together experts from different fields • Problems and barriers to evaluation • Visions and strategies with regard to evaluation of health information systems • Long-term and short-term strategies to reach these visions

  24. Problems and barriers • Awareness: ”Evaluation is too academic and has no relevance for developers, decision-makers, users and politicians” • Methodological issues: Not choosing the right methods; poor application of methods • Practical issues: Conflicting interests • Dissemination: Evaluation results are not published

  25. Visions and strategies • Awareness: Measurement of success and non-success is an integral part of IS design • Methodology: Methods chosen according to the evaluation question; research on methods is a valid research theme • Practical: Sufficient funding • Dissemination: Reporting studies for different audiences

  26. Implementation of activities • Evaluation portal • Good evaluation practice and reporting • Networks • Awareness • Educate the evaluator

  27. Conclusion • User involvement is important because health care organizations are complex, and health care work is complex • ISO 13407: Important to understand clinical work and the context of use • Several methods are available for user involvement, including interviews, discussions, observation studies and role play • Evaluation is important, and should be formative, not only summative

  28. References • Importance of user involvement: Moen, A. (2003). A nursing perspective to design and implementation of electronic patient record systems. Journal of Biomedical Informatics, 36(4-5), 375-378. • Methods for user involvement: Sørby, I. D., Melby, L., & Seland, G. (2005). Using scenarios and drama improvisation for identifying and analysing requirements for mobile electronic patient records. In J. L. Maté & A. Silva (Eds.), Requirements engineering for socio-technical systems. Hershey, PA: Information Science Publishing. • Evaluation: Ammenwerth, E., Brender, J., Nykänen, P., Prokosch, H.-U., Rigby, M., & Talmon, J. (2004). Visions and strategies to improve evaluation of health information systems: Reflections and lessons based on the HIS-EVAL workshop in Innsbruck. International Journal of Medical Informatics, 73(6), 479. • Standard for human-centred design: ISO 13407 (1999). Human-centred design processes for interactive systems. London: British Standards Institution.
