
Lessons for Health Emergency Management

Explore lessons identified in recent disasters and health emergencies, discuss evaluation principles, identify areas for improvement in health emergency management, and prioritize actions for development.


Presentation Transcript


  1. From Evaluation to Practice: Lessons for Health Emergency Management Additional module

  2. Learning Objectives • Discuss lessons identified in recent disasters and health emergencies • Explore the basic principles of evaluation • Identify aspects of health emergency management which need improvement • Identify areas for action in health emergency management development

  3. Lessons Identified or Lessons Learned “the only lesson we learn from disasters is that we don’t learn the lessons from disasters”

  4. Questions From your own experience, what lessons did you identify from disasters in terms of management? What are the lessons identified from international experiences concerning public health emergency management? Why do we do evaluations, and how can we do them effectively? Considering these questions and the role of the health emergency manager, what would you see as top priorities for action?

  5. Outline – an Approach to Evaluation Small group discussions • Lessons identified: personal views Dialogue session • Lessons identified: expert views About evaluations • A pragmatic approach to evaluation Plenary discussion • Learning priorities for capacity development: group view

  6. Activity • Each group has a different focus: use the guidelines • Based on personal experience, what lessons did you identify in terms of health emergency management? • Compare experiences & build consensus on key lessons identified • List your group's results on a flipchart

  7. Synopsis of Evaluations on the Health Response to Disasters: Key Lessons Identified for Health Emergency Management

  8. Sources Guinsaugon Landslide (February 2006) ULTRA Stampede (February 2006) Guimaras Oil Spill (August 2006) Typhoon Reming (December 2006) Other disasters

  9. Lessons Identified in Context of PHEM Risk assessment Risk reduction Preparedness Response Recovery & Reconstruction Health information systems Risk communication

  10. Risk Assessment Vulnerability indicators should include: • Root causes • Socio-economic causes • Onsite threats Thailand • Locals…migrants…tourists…& industries • Ability to recover?

  11. Risk Reduction: Early Warning Systems Capacity & coordination • Capacity building is needed • Infrastructure & personnel • Coordination structures need to be developed Example: all affected countries • No tsunami warning systems in place • Inefficient inter-sector communication in most affected countries

  12. Risk Reduction Measures Integration of prevention and preparedness into the recovery process must be advocated • Prevention measures are more cost-effective than damage response Indonesia, Sri Lanka, India (Andaman), Thailand • Coastal buffer zones • Tourism, business • Housing quality • Migrants • Health care facilities

  13. Policy: Emergency Response Policy without enforcement was powerless Example: Thailand & Sri Lanka • Operational problems for NGOs • Center for Non-Government Sector

  14. Preparedness: Quality of Response Disaster preparedness is crucial Example: Indonesia & Thailand • The quality of response was affected by the degree of preparedness

  15. Preparedness: Coordination-Communication Lack of effective coordination • Internal-external • Policy-practice • Sectors-agencies Example: all affected countries • Reporting formats • Chain of command • Media

  16. Preparedness: Resource Management Common problems • Personnel, finance, partners & logistics Example: countries / international aid • Foreign consultants, flash appeals, role of the army, leadership in health

  17. Response: Standard Operating Procedures (SOPs) SOPs were non-existent, out of date, or not used Example: all affected countries • Expressed need to develop or review SOPs

  18. Response: Logistics Logistics were troublesome • Procurement, delivery, maintenance Example: most affected countries • Bureaucratic routine • Customs • Expired drug supplies • Inappropriate technology

  19. Response: Health Assessments Needs assessments were dysfunctional • Victims were over-assessed • Decision makers were under-informed Example: Indonesia • Multiple sectors & agencies conducted their own assessments • Stakeholders did not share enough information

  20. Response and Recovery: Coordination International-national & central-local relationships are crucial Risk reduction, preparedness & response affect health system functioning

  21. Recovery: Closing the Cycle Response + recovery + risk reduction = development Example: WHO Sri Lanka • The use of flash appeal funds to support long-term development – Opportunities for change

  22. Health Information Systems: Diseases Relevance? • Disease oriented • Data vs. decisions Example: all affected countries • Focused too heavily on communicable diseases (CD) • Dysfunctional • Mainly serving central levels

  23. Risk Communication: Media Relations Working with the media needs attention Example: all affected countries • First reports came from the media • Soundness of information… • Relationships…

  24. Principles of Evaluation: An Introduction

  25. How can we Define Evaluation? (1) The classic perspective: • Concerned with the achievement of objectives The broad perspective: • Achievement of objectives is a key, but it is only part of what an evaluation might be concerned with • Unplanned and unexpected outcomes or processes might be very important and would not be looked for if evaluation were limited to objectives

  26. How can we Define Evaluation? (2) Key elements in defining evaluation are: • The need for systematic collection of information • The wide range of topics to which evaluation can be applied • To be effective, the evaluation results have to be used by someone • The wide variety of purposes of evaluations

  27. Is There a Difference Between Research & Evaluation? • Research & evaluations use the same toolbox (methodologies) • However, for a different purpose • Research aims to prove… • Evaluation aims to improve…

  28. Why do we do Evaluations? It’s all about interventions or programs to: • Inform planning • Define progress • Examine efficiency • Examine effectiveness or achievement • Inform decision-making

  29. How could you find out the Purpose of an Evaluation? • Who asked for the evaluation? • Who pays for the evaluation? • Why do these people want an evaluation? • What are the decisions that need to be made? • What information is required to facilitate decision-making? • Who is going to be affected by the evaluation outcomes?

  30. Classifying Evaluation Purposes Needs assessments • Before action / response to understand context & needs • Examples: risk assessment, capacity assessment, damage assessment, health assessment etc. Program monitoring • Compliance with policy / plan • Validity of assumptions & pre-conditions Formative evaluation • Efficiency Summative evaluation • Effectiveness • Decision-making on continuation, expansion, reduction, closure, funding
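
  The taxonomy above can be restated compactly. Below is a minimal Python sketch that encodes the four evaluation purposes as a simple lookup structure; the class and field names, and the wording of the descriptions, are illustrative assumptions that only restate the slide's content, not an official classification tool.

```python
# Minimal sketch: the slide's taxonomy of evaluation purposes as a data
# structure. Names and descriptions are assumptions drawn from the slide
# text, for illustration only.
from dataclasses import dataclass

@dataclass
class EvaluationType:
    name: str
    timing: str   # when in the programme / response cycle it is used
    focus: str    # the question it is meant to answer

EVALUATION_TYPES = [
    EvaluationType("Needs assessment", "before action / response",
                   "Understand context & needs (risk, capacity, damage, health assessments)"),
    EvaluationType("Program monitoring", "during implementation",
                   "Compliance with policy / plan; validity of assumptions & pre-conditions"),
    EvaluationType("Formative evaluation", "during implementation",
                   "Efficiency: how can the program be improved?"),
    EvaluationType("Summative evaluation", "after implementation",
                   "Effectiveness: continue, expand, reduce, close, or fund?"),
]

if __name__ == "__main__":
    for e in EVALUATION_TYPES:
        print(f"{e.name} ({e.timing}): {e.focus}")
```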

  31. Debriefs, Hot Washes & Reviews What: A during-action (operations) or post-action (e.g. exercise / response) reflection Who: Individuals, service provider groups and/or agencies involved Why: Highlight lessons identified and take action How: Identify areas for improvement in procedures, equipment and systems What not: not a forum for criticizing the performance of others; operational debriefs ≠ trauma debriefs

  32. Bias and Politics of Evaluations • Quick and dirty evaluations • Weighty evaluations • Guess evaluations • Personality-focused evaluations • Eyewash evaluations • Whitewash evaluations • Submarine evaluations • Posture evaluations • Postponement evaluations

  33. Evaluation Logic

  34. What Methods can you Think of to Collect Information? Quantitative • Documentation research • Survey interviews • Clinical surveys Qualitative • Documentation research • Interviews • Focus Groups • Observations

  35. Quantitative Data Sources & Methods • Documented information (e.g. medical records / vital statistics on population samples) • Individual data on samples of a subgroup (e.g. clinical conditions) • Indirect measures • Observation • Survey interviews • Individual data from samples of a community or subgroup (e.g. demographics, disease / injury histories, etc.)

  36. Qualitative Data Sources & Methods Methods: • Observation • Focus groups • In-depth interviews • Individual interviews • Key informants • Indirect measures • Documented information What they capture: • Behaviour • Community perspective (normative view) • Individual perspective (intra-cultural variation) • In-depth knowledge

  37. About Methods • Questions will tell you what information you can obtain • Information needed will tell you what methods could be used • Local context & resource limits will tell you what methods are feasible • Keep it as simple as possible
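
  The logic above (questions determine the information needed, the information needed suggests candidate methods, and local context and resource limits determine what is feasible) can be sketched as a simple filter. The following minimal Python sketch is illustrative only; the method names and resource criteria are assumptions, not a prescribed instrument.

```python
# Minimal sketch of the slide's decision logic: candidate data-collection
# methods are filtered by what the local context and resources allow.
# All names and criteria here are illustrative assumptions.

CANDIDATE_METHODS = {
    "survey interviews":  {"needs_trained_interviewers": True,  "cost": "high"},
    "focus groups":       {"needs_trained_interviewers": True,  "cost": "low"},
    "document review":    {"needs_trained_interviewers": False, "cost": "low"},
    "direct observation": {"needs_trained_interviewers": False, "cost": "low"},
}

def feasible_methods(have_interviewers: bool, budget: str) -> list[str]:
    """Keep only the methods that local capacity and budget allow."""
    rank = {"low": 0, "high": 1}
    return [
        name for name, req in CANDIDATE_METHODS.items()
        if (have_interviewers or not req["needs_trained_interviewers"])
        and rank[req["cost"]] <= rank[budget]
    ]

if __name__ == "__main__":
    # Example: no trained interviewers available and a low budget.
    print(feasible_methods(have_interviewers=False, budget="low"))
    # -> ['document review', 'direct observation']
```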

  38. In Summary To find out something useful about an intervention or program, use whatever you have in your toolbox that will get the job done…

  39. Activity: Plenary At the end of the day…considering the role of HEM in disasters: What would you, as a group, see as top priorities for action?

  40. Putting it Together-1 Build capacity in risk management & vulnerability reduction • Policy & legislation development • Develop lines of authority & control • Allocate resources for risk management & vulnerability reduction Give attention to health assessments • Relevant to decision-making • Multi-sectoral • Population-based

  41. Putting it Together-2 Develop benchmarks & standards of practice • Vulnerability indicators • Surveillance systems • Relevant health assessments • Indicators on effectiveness, efficiency, costs, & benefits of both preparedness & response Improve coordination of responses • Internal • External assistance • Policy-practice

  42. Putting it Together-3 Develop logistics systems • Legislation • Relevance • Capacity building Foster contribution of other sectors • Civil-military liaisons • Public-private sector liaisons • NGOs’ inclusion, not marginalization

  43. Putting it Together-4 Develop risk communication • IEC prior to and during disasters • Combat disaster myths • Guidance on media relations Develop capacity • Health emergency management leadership • PHEM capacity at all levels and functions • Networking & partnerships • Knowledge development

  44. From Evaluation to Practice: Key to Developing Health Emergency Management (Diagram: Evaluation • Research • Awareness • Advocacy • Capacity Building • Strengthening Development • Improved Health Outcomes)

  45. THANK YOU
