
Welcome to Today’s ECLIPS Webinar


Presentation Transcript


  1. Welcome to Today’s ECLIPS Webinar

  2. Check-in • What’s happened since our last gathering that you’d like to share with the group (either personal or professional)?

  3. ECLIPS Webinar (February 17, 2012): How does systems thinking inform evaluation purposes and evaluation roles?

  4. Overview of Today’s Webinar • Tarek introduces his work (5 minutes) • System concepts’ link to evaluation purposes/evaluator roles: Marah, Karen, and Ginger (50 minutes) • Possible AEA proposals; ECLIPS meeting October 28 (10 minutes) • Preparation for annual report (10 minutes)

  5. Question 1 • Here are five purposes for/types of evaluation. Please identify the purpose(s)/type(s) of evaluation you are using in your ECLIPS STEM evaluation. (Check all that apply.) Results, as Response Percent (Response Count):
     • Summative evaluation: 100.0% (6)
     • Formative evaluation: 100.0% (6)
     • Accountability/Monitoring: 33.3% (2)
     • Developmental evaluation: 33.3% (2)
     • Knowledge generation: 66.7% (4)
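  The percents on this and the following survey slides follow directly from the counts: each is the item’s count divided by the six respondents, and because the questions were check-all-that-apply, the percents need not sum to 100%. A minimal Python sketch of that arithmetic (counts taken from the table above; variable names are ours):

```python
# Reproduce the Question 1 response percents from the raw counts.
# n = 6 respondents; the question was check-all-that-apply, so the
# percents are computed per item and need not sum to 100%.
counts = {
    "Summative evaluation": 6,
    "Formative evaluation": 6,
    "Accountability/Monitoring": 2,
    "Developmental evaluation": 2,
    "Knowledge generation": 4,
}
N_RESPONDENTS = 6

for purpose, count in counts.items():
    percent = 100 * count / N_RESPONDENTS
    print(f"{purpose}: {percent:.1f}% ({count} of {N_RESPONDENTS})")
```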

  6. Question 3 • Which of these roles do you play in the STEM evaluation you are focused on for the ECLIPS work? (Check all that apply.) Results, as Response Percent (Response Count):
     • Help users clarify their purpose, hoped-for results: 66.7% (4)
     • Help program designers/users conceptualize theory of change: 50.0% (3)
     • Offer conceptual and methodological options for program design/implementation: 50.0% (3)
     • Question assumptions: 50.0% (3)
     • Gather data from stakeholders: 83.3% (5)
     • Analyze data: 83.3% (5)
     • Provide data interpretations: 100.0% (6)
     • Make recommendations based on data: 100.0% (6)
     • Facilitate interpretative dialogue: 50.0% (3)
     • Other roles (please describe): 33.3% (2)

  7. Question 5 • Which of these roles do you play in other evaluations you are doing or have proposed to conduct? (Check all that apply.) Results, as Response Percent (Response Count):
     • Help users clarify their purpose, hoped-for results: 83.3% (5)
     • Help program designers/users conceptualize theory of change: 50.0% (3)
     • Offer conceptual and methodological options for program design/implementation: 83.3% (5)
     • Question assumptions: 66.7% (4)
     • Gather data from stakeholders: 100.0% (6)
     • Analyze data: 100.0% (6)
     • Provide data interpretations: 100.0% (6)
     • Make recommendations based on data: 100.0% (6)
     • Facilitate interpretative dialogue: 83.3% (5)
     • Other roles (please describe): 16.7% (1)

  8. Comparison of Evaluator Roles in ECLIPS and Other Evaluations (Q3 and Q5)
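  The comparison chart on this slide did not survive the transcript; only its title remains. The underlying numbers can be rebuilt from slides 6 and 7, though. A minimal Python sketch that tabulates the Q3 and Q5 counts side by side (role labels shortened for width; counts copied from the tables above):

```python
# Side-by-side view of evaluator roles in the ECLIPS evaluations (Q3)
# versus other evaluations (Q5). Counts are copied from slides 6 and 7
# (n = 6 respondents, check all that apply).
roles = [
    ("Clarify purpose / hoped-for results",  4, 5),
    ("Conceptualize theory of change",       3, 3),
    ("Offer design/implementation options",  3, 5),
    ("Question assumptions",                 3, 4),
    ("Gather data from stakeholders",        5, 6),
    ("Analyze data",                         5, 6),
    ("Provide data interpretations",         6, 6),
    ("Make recommendations based on data",   6, 6),
    ("Facilitate interpretative dialogue",   3, 5),
    ("Other roles",                          2, 1),
]

print(f"{'Role':<38} {'ECLIPS (Q3)':>11} {'Other (Q5)':>10}")
for role, q3, q5 in roles:
    print(f"{role:<38} {q3:>11} {q5:>10}")
```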

  9. Topics for Discussion • Discuss evaluation purpose(s) and the role(s) you play in your ECLIPS STEM evaluation. • Which roles are most comfortable for you and why? • Are there changes you would make if you could? Explain. • Has the information that we’ve discussed to this point shifted your thinking about the purpose of the evaluation? About your role? Explain.

  10. Question 6 • Which of the following evaluator roles do you most enjoy? (Select up to three.) Results, as Response Percent (Response Count):
     • Help users clarify their purpose, hoped-for results: 33.3% (2)
     • Help program designers/users conceptualize theory of change: 33.3% (2)
     • Offer conceptual and methodological options for program design/implementation: 50.0% (3)
     • Question assumptions: 33.3% (2)
     • Gather data from stakeholders: 16.7% (1)
     • Analyze data: 33.3% (2)
     • Provide data interpretations: 16.7% (1)
     • Make recommendations based on data: 50.0% (3)
     • Facilitate interpretative dialogue: 66.7% (4)
     • Other roles (please describe): 0.0% (0)

  11. March Webinar • Transitioning an existing evaluation (e.g., design established in funded proposal, underway for several years) into a systems-based one • Initial focus on complex adaptive systems (self-organizing systems) and developmental evaluation

  12. Evaluation Definitions • Summative evaluation: to make a judgment of the overall merit, worth, value, and significance of the program and model to inform and support major decision making, and/or to determine the future of the program and model, including especially whether it should be disseminated as an exemplar and taken to scale • Formative evaluation: to improve the program; fine-tune the model; clarify key elements and the linkages between inputs, outputs, outcomes, and impact; work out bugs in implementation; fix problems; determine efficacy and effectiveness at a pilot level to establish readiness for summative evaluation; stabilize/standardize a model to get ready for summative evaluation; test and validate instruments/procedures for summative evaluation

  13. Evaluation Definitions (cont.) • Accountability/Monitoring: to demonstrate that resources are well-managed and efficiently attain desired results; manage the program; routine reporting of activities; early detection of problems • Developmental evaluation: to help social innovators explore possibilities for addressing major problems/needs/issues; develop promising innovations; support adaptation in complex, uncertain, dynamic conditions; determine if an innovation is ready for formative evaluation as a pilot intervention • Knowledge generation: to enhance general understandings and identify generic principles about the work being evaluated
