UNIT 16 Exercise Evaluation PowerPoint Presentation
Presentation Transcript

  1. UNIT 16 Exercise Evaluation

  2. Objectives • Describe the evaluation process • Identify objective-based evaluation concepts • Describe an evaluation team • Describe an evaluator checklist, complete with objectives and points of review • Identify evaluator skills, attributes, and pitfalls • Describe the After Action Report process • Describe the Improvement Planning process

  3. DEFINITION OF EXERCISE EVALUATION The act of observing and recording exercise activity or conduct, measuring the observed behavior or activity against the exercise objectives, and noting strengths, areas for improvement, deficiencies, and other observations.

  4. DEFINITION OF EXERCISE EVALUATOR An evaluator is an exercise participant who observes and documents those actions and decisions of the players that are directly related to the exercise objectives.

  5. THE CASE FOR SYSTEMATIC EXERCISE EVALUATION • Exercise evaluation must be addressed consistently and systematically in emergency management exercises • It is the cost-vs-benefit factor

  6. RATIONALE FOR A SYSTEMATIC EXERCISE EVALUATION DESIGN (K. Lerner – ANL)

  7. SYSTEMATIC, OBJECTIVE-BASED EVALUATION DESIGN PROCESS • Formulation of an Evaluation Plan • Based on definition and tracking of exercise objectives • Produces written evaluations for each stated objective • Provides a basis for assessment and upgrading of the emergency management system • Includes evaluation forms that are based on the objectives and correspond with the exercise design criteria • Depends on the use of qualified evaluators

  8. EVALUATION TASKS IN THE OVERALL EXERCISE DEVELOPMENT PROCESS • Exercise process: establish the foundation; design and development; conduct; evaluation; improvement planning • Evaluation process: select evaluation team director; develop evaluation plan; select, organize & train evaluators; observe & document player actions; participate in post-exercise critiques & meetings; develop evaluation reports; track long-term follow-up* (*may not be evaluator responsibility)

  9. THE EXERCISE EVALUATION PROCESS • Pre-exercise phase: select the evaluation team leader; develop the Evaluation Plan; select & organize the evaluation team; conduct team meetings; train evaluators • Exercise phase: observe assigned objectives; document actions; coordinate the evaluator team • Post-exercise phase: assess achievement of exercise objectives; attend post-exercise meetings; prepare the written report

  10. PRE-EXERCISE TASKS OF THE EVALUATION TEAM DIRECTOR • Select the evaluation methodology, and if necessary, develop the appropriate evaluation forms • Develop the Evaluation Plan • Select and train evaluation team members

  11. EXERCISE PHASE TASKS OF THE EVALUATION TEAM DIRECTOR During the exercise, ensures evaluators are: • In the right locations • Equipped with the correct documentation & equipment • Observing & documenting player actions • Provided with backup, if needed

  12. POST-EXERCISE TASKS OF THE EVALUATION TEAM DIRECTOR • Ensures evaluators assess achievement of exercise objectives • Coordinates involvement of evaluators in post-exercise meetings • Coordinates and reviews drafting of written reports • May play a role in long-term follow-up

  13. EVALUATION METHODOLOGY DEFINITION • EVERY EXERCISE EVALUATION METHODOLOGY IS DEFINED BY: • Objectives to be demonstrated • Evaluator team size, critique process, & scheduling • Forms, checklists, and points of review for use by the evaluators • Information & logistics support including evaluator packets • After action follow-up
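The objectives-plus-forms structure just described can be sketched as a simple data model: each objective carries its points of review, and the evaluator records what was (or was not) demonstrated. This is an illustrative sketch only; the class names, fields, and the 80% threshold are invented for the example and are not part of HSEEP or this unit's materials.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PointOfReview:
    """One observable action or condition the evaluator checks off."""
    text: str                       # e.g. "Sirens activated within 15 min"
    met: Optional[bool] = None      # None = not yet observed

@dataclass
class ExerciseObjective:
    """A hypothetical model of an objective and its points of review."""
    name: str
    points: List[PointOfReview] = field(default_factory=list)

    def demonstrated(self, threshold: float = 0.8) -> bool:
        """Treat the objective as demonstrated when enough observed
        points were met. The 80% threshold is an illustrative
        assumption, not an HSEEP rule."""
        observed = [p for p in self.points if p.met is not None]
        if not observed:
            return False
        return sum(p.met for p in observed) / len(observed) >= threshold
```

In practice the evaluator fills in each point during the exercise phase, and the post-exercise assessment of the objective falls out of the recorded points rather than a gut impression.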

  14. EXERCISE OBJECTIVES Serve as the basis for evaluation of the exercise: standards & criteria (EOP, SOP) → exercise objectives → systematic exercise evaluation

  15. DISCUSSION QUESTION Where can standard lists of emergency management exercise objectives be obtained?

  16. EXERCISE OBJECTIVES • Should be demonstrable with a reasonable commitment of resources • Should be defined so that related actions occur in one location

  17. EXERCISE OBJECTIVES Should divide the exercise into discrete functions: direction & control; fire/rescue; evacuation/shelter & mass care; communication; warning; alert/notification

  18. EXAMPLES • EMS: Medical Command will triage, treat, and transport victims within 45 min • Hazmat: Will identify the product & initiate offensive containment within 90 min • Pub Info: PIO will coordinate interagency information releases per protocols • Res Mgmt: CP/EOC will coordinate resources

  19. SAMPLE OBJECTIVES

  20. TYPES OF OBSERVATIONAL DATA • DESCRIPTIVE: • Reporting everything that is related to the assigned function • Usually reliable because it requires little inference

  21. TYPES OF OBSERVATIONAL DATA • INFERENTIAL: • Requires the evaluator to make inferences before recording data • Harder to collect reliable information

  22. TYPES OF OBSERVATIONAL DATA • EVALUATIVE: • Requires evaluative judgments as well as inferences • Most complex and difficult to collect

  23. EVALUATOR PACKETS – FORMS • Remind the evaluator what to look for • Prompt the evaluator to gather specific information in an organized manner

  24. POINTS OF REVIEW Are used by the evaluators to determine whether or not objectives are successfully demonstrated

  25. DISCUSSION QUESTION What actions or conditions would an evaluator look for to determine if this objective has been met? “Demonstrate communications capabilities with all appropriate emergency response locations, organizations, and personnel.”

  26. EVALUATION FORM DESIGN Keep the questions short and simple • What time did responders arrive?

  27. EVALUATION FORM DESIGN KEEP THE FORM MANAGEABLE • Try to limit to a maximum of 15-20 Points of Review • May be more extensive for complex exercises • Indicates points of review for the exercise objectives

  28. EVALUATION FORM DESIGN Keep layout and typeface (font) simple & bold • Example: Did the EMS units arrive at the scene within 10 minutes of being dispatched?

  29. EVALUATION FORM DESIGN Do not ask for known information Were the participants aware of the location of the EOC?

  30. EVALUATION FORM DESIGN Do not ask questions the evaluator cannot answer If an evaluator is in the field, do not ask what time the call came into the dispatch center.

  31. EVALUATION FORM DESIGN Objective specific narrative summaries give substance to basic information on the checklist portion of the form.

  32. EVALUATION FORM DESIGN References help the evaluator in later assessment of the exercise objectives.
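The form-design guidance in the slides above (short questions, a manageable number of points of review, room for a narrative summary) can be sketched as a small form generator. This is a hypothetical illustration; the function name, layout, and example objective are invented, and the 20-point cap simply encodes the 15-20 guideline from slide 27.

```python
def render_evaluation_form(objective: str, points_of_review: list) -> str:
    """Render a paper-style checklist for one exercise objective.
    Caps the list at 20 points, per the 15-20 guideline (an
    assumption for simple exercises; complex ones may need more)."""
    if len(points_of_review) > 20:
        raise ValueError("Keep the form manageable: limit to 15-20 points of review")
    lines = [f"OBJECTIVE: {objective}", ""]
    for i, point in enumerate(points_of_review, start=1):
        lines.append(f"{i:2d}. [ ] Yes  [ ] No  [ ] Not observed   {point}")
    lines += ["", "NARRATIVE SUMMARY (strengths / areas for improvement):", "_" * 60]
    return "\n".join(lines)

# Invented example objective and questions, kept short and answerable
# from the evaluator's own vantage point:
form = render_evaluation_form(
    "Demonstrate timely EMS response",
    ["Did the EMS units arrive on scene within 10 minutes of dispatch?",
     "What time did responders arrive?"],
)
print(form)
```

Note how the design rules fall out of the structure: each line is one short question, the narrative block gives substance beyond the checkboxes, and a "Not observed" option keeps evaluators from guessing at answers they cannot know.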

  33. SAMPLE EVALUATION FORM

  34. STATE OF OHIO TERRORISM EXERCISE AND EVALUATION MANUAL • Developed by the State of Ohio with Direct Support from the Office for Domestic Preparedness (ODP) • Nearly 18 months from concept to completion • Based on HSEEP Volume II • Cross-walked to HSEEP Volume II Tasks and EEGs • Utilized in-state for over 75 exercises in 2004 • Utilized for ODP Direct Support Exercises in Cincinnati (MLB) and Cleveland (UASI)

  35. EVALUATOR PITFALLS Sometimes evaluators are not objective

  36. MINIMIZING EVALUATOR FATIGUE How can evaluator fatigue be minimized?

  37. EVALUATOR EFFECT The presence of evaluators may influence the behavior of the players because they know what the evaluators are looking for.

  38. REDUCING EVALUATOR EFFECT Evaluator effect is reduced when players know the evaluation report will not reflect unfavorably on individuals.

  39. REDUCING EVALUATOR EFFECT The evaluator should be at the appropriate place when the players arrive.

  40. EVALUATOR BIAS This type of bias refers to errors traceable to characteristics of the evaluator or the observed situation. What are some ways to avoid evaluator bias?

  41. RATING ERRORS Data can be influenced by certain readily identifiable errors.

  42. ERROR OF LENIENCY Some evaluators will rate all actions positively. (“What a great job by all!”)

  43. ERROR OF CENTRAL TENDENCY An individual who describes all activities as average to avoid making any difficult decisions is committing the Error of Central Tendency.

  44. HALO EFFECT The “Halo Effect” is the tendency for an evaluator to form an early impression of an individual or an operation and permit this impression to influence his or her observations.

  45. HYPERCRITICAL EFFECT The hypercritical effect occurs when an evaluator believes that it is his or her job to find something wrong regardless of the players’ performance. (“I know something’s wrong here.”)

  46. CONTAMINATION Contamination has to do with the influence of the evaluator’s knowledge or expectations about certain aspects of the exercise. (“Same animals as last year.”)
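The rating errors described in slides 42-43 leave recognizable fingerprints in numeric rating data, so a team director reviewing completed forms could run a rough screen over each evaluator's scores. The following sketch is purely illustrative: the function name and every threshold are invented assumptions, not standard diagnostic values.

```python
def screen_ratings(ratings: list, scale_max: float = 5.0) -> list:
    """Flag rating patterns suggestive of leniency or central tendency.
    All thresholds below are arbitrary illustrative choices."""
    flags = []
    mean = sum(ratings) / len(ratings)
    variance = sum((r - mean) ** 2 for r in ratings) / len(ratings)
    # Error of leniency: nearly everything rated near the top of the scale.
    if mean >= 0.9 * scale_max:
        flags.append("possible error of leniency: nearly all ratings are high")
    # Error of central tendency: ratings tightly clustered at mid-scale.
    if variance < 0.25 and abs(mean - scale_max / 2) < 0.5:
        flags.append("possible error of central tendency: all ratings near average")
    return flags
```

A screen like this cannot prove bias (a uniformly excellent performance legitimately earns uniformly high marks); it only highlights which evaluators' forms deserve a second look during the post-exercise review.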

  47. DISCUSSION QUESTIONS • If evaluators need additional information, what sources other than direct questioning might be available? • What are some reasons why it would be beneficial to allow evaluators to intervene in the exercise? • Why would you not want evaluators to intervene?

  48. EFFECTIVE QUESTIONING Intervene only when necessary What alternative questions might be asked to obtain the information sought?

  49. EFFECTIVE QUESTIONING Avoid leading questions Was the notification process completed within the appropriate timeframe?

  50. EFFECTIVE QUESTIONING Avoid prompting questions Have you begun evacuation of the affected area yet?