
LESSONS TO BE LEARNED: Measurement of Unit Performance


Presentation Transcript


  1. LESSONS TO BE LEARNED: Measurement of Unit Performance Paper to the NTSA DoD Training Transformation Technologies Meeting JAEC ADVISORY GROUP Jack Hiller, Chief Scientist, Mission Systems, I&TSD, Northrop Grumman Corp. September 5, 2003

  2. Background Useful References: 1. Determinants of Effective Unit Performance (ARI, 1994). 2. Assessing and Measuring Training Performance (ARI Technical Report 1116, 2000).

  3. Evaluation vs. Assessment • Evaluation: • Standards or criteria permitting objective performance scoring. • Mission or task outcomes that are observable. • Assessment: • Rating criteria indefinite/fuzzy. • Too many variables and chance factors.

  4. Evaluation vs. Assessment (Cont.) Increasing Role for Assessment Approaches • FCS Units of Action may command far-ranging support, similar to brigade and higher. • Terrorist options for inflicting damage are virtually infinite, so predefined mission task standards for decision making might lack relevance. • Joint operations command decisions are far ranging, encompassing complex multiple dimensions: Diplomatic, Information, Military, Economic (DIME)

  5. Lessons To Be Learned: The Primacy of Defined Purpose Examples of problems with measurement purpose: 1. “We will collect all the data and get it to you.” (NTC Commander) 2. “We are building a ten thousand item database which will answer all questions about the determinants of unit effectiveness.”

  6. Lessons To Be Learned: The Primacy of Defined Purpose (Cont.) 3. Data collected in BOS categories to support NTC AARs were problematic for use in: a. Take Home Packages. b. Systematically analyzing for Lessons Learned in DOTLMS. c. Systematically analyzing for trends in DOTLMS.

  7. Unknown Measurement Reliability, and Thus Uncertain Validity • Four developers of infantry battle drills were asked to independently rate the performances of a number of trained squads. 1. Infantry LTC scored performances uniformly NOGO. 2. Platoon SGT rated performances uniformly GO.

  8. Unknown Measurement Reliability, and Thus Uncertain Validity (Cont.) 3. PhD researcher rated about half GO and half NOGO. 4. Highly experienced infantry researcher refused to rate, because conditions did not follow prescribed directions. • The evaluation test turned into an assessment exercise. • Be wary of scores from untrained raters.
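
The inter-rater disagreement described in slides 7 and 8 can be made concrete with a simple agreement calculation. This is only an illustrative sketch: the ratings below are invented to mirror the pattern the slides describe (one rater all NOGO, one all GO, one split), and the percent-agreement function is a generic reliability check, not part of the original study.

```python
# Hypothetical GO/NOGO ratings for ten squad performances, patterned on the
# disagreement described in slides 7-8 (values are invented for illustration).
from itertools import combinations

ratings = {
    "Infantry LTC":   ["NOGO"] * 10,        # scored every performance NOGO
    "Platoon SGT":    ["GO"] * 10,          # scored every performance GO
    "PhD researcher": ["GO", "NOGO"] * 5,   # split roughly half and half
}

def percent_agreement(a, b):
    """Fraction of performances on which two raters gave the same score."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Pairwise agreement makes the unreliability visible at a glance.
for (name_a, a), (name_b, b) in combinations(ratings.items(), 2):
    print(f"{name_a} vs. {name_b}: {percent_agreement(a, b):.0%}")
```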

  9. Scope of the Measurement Domain – Implications for the JNTC • The domain is large: the 4 DIME dimensions crossing the 7 DOTMLPF categories form a matrix of 28 cells, many of them relevant for rating. • Automation technology should be enlisted to support measurement personnel (O/Cs).
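
A minimal sketch of the measurement domain the slide describes: the four DIME dimensions crossed with the seven DOTMLPF categories, giving 28 cells that could each accumulate observations. The data structure is an assumption for illustration, not a fielded design.

```python
# Cross the 4 DIME dimensions with the 7 DOTMLPF categories (4 x 7 = 28 cells).
from itertools import product

DIME = ["Diplomatic", "Information", "Military", "Economic"]
DOTMLPF = ["Doctrine", "Organization", "Training", "Materiel",
           "Leadership and education", "Personnel", "Facilities"]

# One empty observation list per cell of the measurement domain.
domain = {(d, c): [] for d, c in product(DIME, DOTMLPF)}
print(len(domain))  # 28
```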

  10. Scope of the Measurement Domain – Implications for the JNTC (Cont.) PDA wireless computers can facilitate: • Prompting what is to be observed against defined performance standards. • Recording ratings and comments. • Recording the audio-visual, time-tagged context. • Rapid, easy retrieval of data for analysis and for preparation of AARs. • Cumulative storage of data to support lessons-learned and performance-trend analysis.
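
One way to picture the PDA-based capture the slide lists is a single observation record that ties the prompted standard, the rating and comment, and a time-tagged pointer to the audio-visual context together. The field names and example values below are hypothetical, not taken from any fielded system.

```python
# Hypothetical record layout for PDA-based observation capture (illustrative only).
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Observation:
    standard_id: str               # which defined performance standard was prompted
    rating: str                    # e.g. "GO" / "NOGO" or a scale value
    comment: str = ""              # free-text rater comment
    av_clip: Optional[str] = None  # reference to the time-tagged audio-visual context
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example entry (invented): a NOGO on a battle-drill standard with a video pointer.
obs = Observation(standard_id="BD-07.3", rating="NOGO",
                  comment="Squad assaulted before reporting contact.",
                  av_clip="cam2_1435Z.mp4")
print(obs.standard_id, obs.rating, obs.timestamp.isoformat())
```

Records of this kind support both the rapid AAR retrieval and the cumulative trend analysis named in the slide, since every rating stays linked to its standard and its recorded context.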

  11. An Approach for Measuring Adaptable and Creative Performance • (1) Provide rating directions and (2) calibration training for assessment personnel? Premature. Instead: • Provide a three- or five-point rating scale for creativity. • Direct assessors to explain their rating. • After experience has been accumulated, it may be possible to revisit (1) and (2). • Initial assessors need to be acknowledged experts.
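
A sketch of the interim approach outlined above: a five-point creativity rating that is only accepted when the assessor also supplies an explanation. The scale anchors, field names, and example text are assumptions made for illustration.

```python
# Illustrative five-point creativity assessment with a required rationale.
from dataclasses import dataclass

ANCHORS = {1: "routine", 3: "adaptive", 5: "highly creative"}  # assumed anchors

@dataclass
class CreativityAssessment:
    assessor: str    # should be an acknowledged expert
    score: int       # 1..5 rating of creative/adaptive performance
    rationale: str   # the explanation the assessor is directed to give

    def __post_init__(self):
        if not 1 <= self.score <= 5:
            raise ValueError("score must be on the 1-5 scale")
        if not self.rationale.strip():
            raise ValueError("a rating must be accompanied by an explanation")

a = CreativityAssessment("Senior O/C", 4,
                         "Improvised a feint exploiting terrain not covered in the plan.")
print(a.score, "-", a.rationale)
```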

  12. Conclusions • Establish the multiple purposes for measurement. • Map each articulated purpose to the measures to be collected for it. • Establish a mechanism to implement the lessons to be learned.
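
The second conclusion, mapping each articulated purpose to the measures collected for it, can be expressed as a simple lookup. The purposes and measures below are illustrative placeholders drawn from earlier slides, not a prescribed list.

```python
# Illustrative purpose-to-measures mapping (placeholders, not a prescribed list).
purpose_to_measures = {
    "Unit AAR feedback":         ["task outcome scores", "O/C ratings and comments"],
    "Take Home Package":         ["task outcome scores", "trend summaries"],
    "Lessons learned / trends":  ["observations coded by DIME x DOTMLPF cell"],
}

for purpose, measures in purpose_to_measures.items():
    print(f"{purpose}: {', '.join(measures)}")
```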
