
Using Standard Setting in evaluation: Exploring differences between the ordinary and excellent


Presentation Transcript


  1. Using Standard Setting in evaluation: Exploring differences between the ordinary and excellent • Janet Clinton, University of Auckland, New Zealand & University of Connecticut, 2009

  2. The chocolate chip cookie exercise

  3. Chocolate chip standards of success

  4. Aims • Explore judgment of success • Illustrate standard setting as an approach to systematically judging levels of success • Demonstrate that standard setting can be a powerful tool in program evaluation, whether large or small • Show that standard setting is particularly useful in health settings

  5. What is a standard? • A standard is a statement about whether a performance is good enough for a particular purpose • Evaluation is about judgments

  6. The judges

  7. How have we done this in the past? • The use of benchmarks to understand success: • Common practice • Developed from literature reviews & input by stakeholders • However, often inaccurate or not representative • The lack of strong evidence, and the diversity of cultures, often makes the process of determining standards difficult

  8. Praying…

  9. What is standard setting? • Standard setting is a means of systematically harnessing human judgment to define and defend a set of standards

  10. Standard setting methodology • A process of rationally deriving, consistently applying, and describing procedures on which judgments can be made • Describes the attributes of each category, e.g., what makes a program fall into the category of excellent • The process decides on a “cut score” that then creates distinct and defensible categories, e.g., pass/fail, allow/deny, excellent/poor
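To make the cut-score idea concrete, here is a minimal sketch (not from the original slides) of how a handful of cut scores partitions a continuous rating scale into ordered categories; the scale, boundaries, and category names are all invented for illustration.

```python
# A minimal sketch of how cut scores turn a continuous rating into
# ordered categories. The 0-100 scale, the boundaries, and the
# category names are all invented for illustration.
import bisect

CUT_SCORES = [40, 70, 85]                       # hypothetical boundaries
CATEGORIES = ["poor", "acceptable", "good", "excellent"]

def classify(score: float) -> str:
    """Return the category whose interval contains the score."""
    return CATEGORIES[bisect.bisect_right(CUT_SCORES, score)]

print(classify(62))   # acceptable
print(classify(91))   # excellent
```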

  11. Theoretical basis & method • Based on theories of judgment analysis & attribution theory • Used in educational measurement, e.g., Jaeger, 1997 • Methods: Angoff, Bookmark, Hofstee, Lens Modelling • The method has to be: defensible, credible, supported by a body of evidence in the literature, feasible, and acceptable to all stakeholders • It is all about the process and the right fit
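As one example of these methods, the sketch below shows the core arithmetic of a modified Angoff procedure: each judge estimates, item by item, the probability that a minimally acceptable program would satisfy the item, and the panel cut score is the mean of the judges' summed estimates. The judges and numbers here are invented, not data from this evaluation.

```python
# A sketch of the core arithmetic of a modified Angoff procedure.
# Each judge estimates, per item, the probability that a minimally
# acceptable program would satisfy that item; each judge's implied
# cut score is the sum of their estimates, and the panel cut score
# is the mean across judges. All names and numbers are invented.
import statistics

angoff_estimates = {
    "judge_a": [0.90, 0.70, 0.40, 0.80],
    "judge_b": [0.80, 0.60, 0.50, 0.90],
    "judge_c": [0.85, 0.65, 0.45, 0.70],
}

judge_cut_scores = [sum(items) for items in angoff_estimates.values()]
panel_cut_score = statistics.mean(judge_cut_scores)
print(f"Panel cut score: {panel_cut_score:.2f} out of 4 items")
```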

  12. Standard Setting in an Evaluation of National Chronic Care Management Initiatives

  13. The project • Understand best practice of chronic care management programs nationally • Develop a user-friendly best practice workbook, e.g., for COPD, Stroke, CVD & CHF • The evidence: literature review • What is happening: stocktake survey • Key stakeholders: expert interviews • Acceptable practice: standard setting exercise • Evaluation of individual programs or sites • Development of a best practice workbook

  14. Applying the methodology • Conducted Standard Setting Workshops at 5 sites around NZ • Invite experts/stakeholders • Group and individual rating exercises • Analyze the assessments • Define a standard • Determine success indicators

  15. Preparation work • Understand the evidence base for success • Develop a set of dimensions • Develop profiles of programs • Plan the workshops • Determine the composition of stakeholders • Develop assessment rubrics (the scoring schedule)
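A scoring schedule can be as simple as a mapping from each dimension to its level descriptors. The sketch below is a hypothetical structure, with descriptor text loosely echoing the profile results shown later in the deck; it is not the evaluation's actual rubric.

```python
# A hypothetical data structure for a scoring schedule: each dimension
# maps rating levels to the descriptor a judge rates against. The
# descriptor text loosely echoes the profile results later in the
# deck; it is not the evaluation's actual rubric.
rubric = {
    "collaboration": {
        1: "No collaboration with PHOs or NGOs",
        3: "Ad hoc collaboration with some providers",
        5: "Formal, sustained collaboration across the sector",
    },
    "community_links": {
        1: "No plan to disseminate info to the community",
        3: "Occasional community communication",
        5: "Structured community engagement and feedback",
    },
}

def descriptor(dimension: str, level: int) -> str:
    """Look up the rubric text for a dimension at a given level."""
    return rubric[dimension][level]

print(descriptor("collaboration", 3))
```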

  16. Exercise 1 • Rate the importance of each factor • Rank the importance of the factors that make up best practice • Present the information back • Discuss as a group, e.g., missing dimensions
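One simple way to aggregate the Exercise 1 rankings before presenting them back is to average each factor's rank position across experts. The sketch below uses invented experts and factor names.

```python
# A sketch of aggregating Exercise 1 rankings: average each factor's
# rank position across experts, then present the panel ordering back
# for discussion. Experts and factor names are invented.
import statistics

rankings = {
    "expert_1": ["collaboration", "community_links", "decision_support"],
    "expert_2": ["community_links", "collaboration", "decision_support"],
    "expert_3": ["collaboration", "decision_support", "community_links"],
}

factors = sorted({f for order in rankings.values() for f in order})
mean_rank = {
    f: statistics.mean(order.index(f) + 1 for order in rankings.values())
    for f in factors
}
for factor, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(f"{factor}: mean rank {rank:.2f}")
```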

  17. Site W

  18. Regional Comparison [chart comparing regions]

  19. Exercise 2: Round 1 • Form representative groups • Describe and discuss the dimensions • Read the profiles & make a judgment about the standard of the dimensions • Rate the dimensions • Each individual records the reason or explanation for their choice

  20. Analyzing the information • Analyze all the information • Present the information back to the established groups of mixed stakeholders • Return individual information

  21. Exercise 2: Round 2 • Information returned • Groups discuss their individual choices • Aim to reach group consensus on rating the best practice of each dimension • Record the information along with the explanations
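A quick way to check whether the Round 2 discussion actually moved groups toward consensus is to compare the spread of individual ratings before and after discussion. The sketch below (invented ratings and dimension names) flags a dimension as converging when its standard deviation shrinks between rounds.

```python
# A sketch of one way to check whether Round 2 moved groups toward
# consensus: compare the spread of individual ratings before and
# after the discussion. Ratings and dimension names are invented.
import statistics

round1 = {"delivery_design": [2, 4, 3, 5, 2], "collaboration": [1, 3, 2, 4, 3]}
round2 = {"delivery_design": [3, 4, 3, 4, 3], "collaboration": [2, 3, 3, 3, 3]}

for dim in round1:
    before = statistics.stdev(round1[dim])
    after = statistics.stdev(round2[dim])
    trend = "converging" if after < before else "not converging"
    print(f"{dim}: SD {before:.2f} -> {after:.2f} ({trend})")
```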

  22. Results for each profile • Roles defined; attendance at meetings; program spokesperson • No collaboration with PHOs or NGOs • No plan to disseminate info to community

  23. Analysis of results • Regression to determine the importance of dimensions • Create a matrix of explanations, first by individuals, then by group consensus • Consult the evidence & the experts • Determine the “cut scores” for each assessment area, using the bookmark method
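The regression step can be sketched as an ordinary least-squares fit of the overall profile judgment on the dimension ratings, so the coefficients indicate how strongly each dimension drives the overall judgment. The data and dimension names below are invented; the slides do not give the evaluation's actual model specification.

```python
# A sketch of the regression step: fit the overall profile judgment
# on the dimension ratings by ordinary least squares, so that the
# coefficients indicate how strongly each dimension drives the
# overall judgment. Data and dimension names are invented.
import numpy as np

# rows = rater-by-profile observations; columns = dimension ratings (1-5)
X = np.array([[4, 3, 5], [2, 2, 3], [5, 4, 4], [3, 5, 2], [1, 2, 2], [4, 4, 5]])
y = np.array([4, 2, 5, 3, 1, 4])            # overall rating of each profile

X1 = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
coefs, *_ = np.linalg.lstsq(X1, y, rcond=None)
for name, b in zip(["intercept", "community_links", "collaboration",
                    "decision_support"], coefs):
    print(f"{name}: {b:+.2f}")
```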

  24. Overall means

  25. Factor analysis

                                     Individuals         Groups
                                     1     2     3       1     2     3
  Program & Organization
    Community links                 .92   .06  -.18     .69   .06   .06
    Collaboration                   .74   .04   .05     .90  -.00  -.07
    Focus on health inequalities    .72   .01   .00     .75  -.10   .17
    Conceptual understanding        .61  -.07   .21     .86   .06  -.08
    Delivery design system          .54   .05   .22     .52   .36  -.01
    Engagement of leadership        .40   .06   .39     .48   .24   .24
  Effectiveness/Efficiency
    Attention to efficiency        -.03   .97  -.03     .04   .91  -.02
    Attention to effectiveness      .10   .73   .12     .15   .79  -.03
  Information
    Decision support systems        .08   .10   .72     .30   .19   .70
    Adhere to clinical guidelines   .05   .21   .59    -.11   .67   .32
    Knowledge transfer              .14   .36   .39     .43   .48  -.00

  Factor intercorrelations
    Program                          1                   1
    Efficiency/Effectiveness        .62    1            .68    1
    Information                     .66   .68    1      .38   .51    1
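A loadings table like the one above can be produced with an exploratory factor analysis. The sketch below uses scikit-learn's FactorAnalysis with varimax rotation as a stand-in; the original analysis may have used different software, estimation, or rotation, and the data here is randomly generated purely to show the mechanics.

```python
# A sketch of producing a loadings table like the one above with an
# exploratory factor analysis. scikit-learn's FactorAnalysis with
# varimax rotation is a stand-in for whatever the original analysis
# used; the data is randomly generated purely to show the mechanics.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
ratings = rng.normal(size=(200, 11))      # raters x 11 dimensions (fake)

fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(ratings)
loadings = fa.components_.T               # one row per dimension
print(np.round(loadings, 2))
```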

  26. Across the exemplars

  27. The standard

  28. Did it work? • The analysis illustrates that we can create cut scores for basic through to excellent • We can define the judgment • We have an explanation or criteria for each level

  29. Reactions to the workshops were variable • “We have a voice” • “Systematic” • “Inclusive of roles, regions & views” • “Useful for self-review” • “Hard work” • “We got nothing” • “This is dangerous” • “Stupid!”

  30. Drawbacks • Resource intensive • Analysis can be problematic • Time pressures • Setting league tables in health • Scary

  31. Consequences • Creates a benchmark for judgment • Validates research through practice • Builds understanding of different views of success • Facilitates self-review • Encourages learning • Encourages an evaluation perspective

  32. Working with community groups • Adapt the process • Need 8-10 per group • Greater focus on the explanations • Qualitative analysis • Preparation takes the community into account • Use of evidence to build rubrics & exemplars • Feasibility, Propriety, Accuracy, Utility

  33. Final word • Not so much the METHOD as the PROCESS

  34. The judgment
