
Evaluation and Measurement


Presentation Transcript


  1. Evaluation and Measurement Spring 2010 Quality Training Certificate Program

  2. Quality Training Certificate Program Certificate Details: • Participants have the option to attend individual classes, or to participate in 6 classes and obtain a certificate in quality training. • Classes will be held twice yearly, for a total of three years to complete the certificate. • Classes will continue to cycle, so new participants will be able to complete the certificate no matter when they begin. • Classes can also be taken at any time as stand-alone experiences. • There will be no charge for attending the classes, but supervisor permission must be given for attendance during regular work hours. • Attendees from previously held classes will be given credit toward their certificate. • Classes will last 3 hours each, for a total of 18 contact hours. • Attendance credit will continue to be given for individual classes taken. • When attendees complete six classes, they will fill out an application for certification and send it to BPA, which will verify the information and award the certificate.

  3. Today’s PowerPoint and Handouts Will be available online: http://www3.uwstout.edu/bpa/training.cfm

  4. Overview • Introductions • Goals for Today • Opportunity to suggest additions to today’s agenda • Course Content, including: • Practice Designing an Evaluation • Discussion Questions • Q&A • Breaks as needed

  5. Introductions • Tell us: • Who you are • What worries you most about evaluation and measurement?

  6. Goals for Today • To understand when evaluation can be used • To understand the basic questions that guide the evaluation process • To identify appropriate evaluation methodologies • To implement basic evaluation methodologies • To utilize evaluation results

  7. Usage of Today’s Curriculum • To apply the skills that you learn today in your everyday practice • Within the next month, to practice at least one thing that you learned today

  8. Questions/Topics you Want Covered • Write down anything you want to be sure we cover on the notecards in front of you • I will collect these when we take our first break

  9. To understand when eval. can be used • What can be evaluated: • 3 P’s: • Programs • Processes • Products • Evaluations can be “formal” or “informal” and time-consuming or simple. • All faculty and staff typically engage in some sort of evaluation in their everyday work – although not all apply it in a formal way.

  10. To understand when eval. can be used Two major purposes: • To answer questions about changes that need to be made to programs/processes/products (formative), • To answer questions about continuation/discontinuation of programs/processes/products (summative) Most of the evaluations we do at UW-Stout are formative.

  11. To understand when eval. can be used Examples: • Most grants require an evaluation section • The surveys that you fill out at the end of a training session are a form of evaluation • UW-Stout has a formal evaluation plan for our e-Scholar program, first-year experience program, customized instruction program, etc. • Evaluation can be used to determine if you are meeting customer needs

  12. To understand when eval. can be used Discussion question: • What other examples do you have of when evaluation can or has been used? • What was your experience with these evaluations?

  13. To understand the basic questions that guide the evaluation process Two basic questions: • What is the purpose of the process/product/program? • How will you use the results of the evaluation?

  14. To understand the basic questions that guide the evaluation process Considerations: • It is sometimes hard to identify the goals and usage of the results, but you cannot develop a good evaluation without this information. Other ways to ask the question: • What does success look like? • If this program/process/product were wildly successful, describe what that would look like. • Do not proceed with the evaluation if you cannot articulate how you will use the results.

  15. To understand the basic questions that guide the evaluation process Discussion Question: • How often have you been asked to participate in a survey, focus group, or other data collection method when it was not clear how the results would be used?

  16. To understand the basic questions that guide the evaluation process • All evaluation methods, data, and analysis should tie back to the goals and how the data will be used • A matrix can facilitate this:

  17. Example evaluation plan
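
A minimal sketch of one row of such a matrix, with illustrative entries. The column headings follow the practice sheet described on the following slides; the specific question, methods, and target shown here are assumptions for illustration.

  Evaluation question: Did participants gain the skills the training was intended to build?
  Goal of the program/process/product: To identify appropriate evaluation methodologies
  How the results will be used: To revise course content for the next offering
  How to know if the goal is achieved: Level of knowledge participants have of evaluation methodologies
  Possible method(s): End-of-session survey; follow-up usage data
  Target: (set once methods are more fully developed)
  Target date: (set once methods are more fully developed)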

  18. To understand the basic questions that guide the evaluation process Practice: • Think of an evaluation that you have done, or that you would like to do, or that you anticipate you will be involved with. • If you don’t have one, use one of the examples on the flip chart. • Fill in: • Evaluation question • Goals of the process/product or program • How the results of the evaluation will be used

  19. To understand the basic questions that guide the evaluation process • Once you have identified your goals and how the results can be used, you need to think about how you will know if the goal is achieved. • At this point, do not think about methods. Example: • Goal: To identify appropriate evaluation methodologies • How achieved: Level of knowledge participants have of evaluation methodologies

  20. To understand the basic questions that guide the evaluation process Discussion question: • Who has a goal from their sheet that they are comfortable sharing? • How will we know if this goal is achieved?

  21. To understand the basic questions that guide the evaluation process Practice: • Fill in the “how to know if goal is achieved” column on your sheet

  22. To identify appropriate evaluation methodologies Considerations: • Don’t let methods constrain initial discussion – start with ideas. • People like to challenge methodologies if they don’t like the results. • Too much data is a common problem. All data must tie back to goals. • It often takes more time to collect, manage and analyze data than you think it will. • Utilize multiple methods. • Available time and resources, along with your audience, will often be the driving factors for selecting methods.

  23. To identify appropriate evaluation methodologies Major evaluation methods: • Existing data • Surveys • Focus Groups/Interviews • Inventories/Usage data/Database creation

  24. To identify appropriate evaluation methodologies Discussion questions: • What do you think we see most often at Stout? • What do you think is the most underutilized method at Stout?

  25. To identify appropriate evaluation methodologies Existing data You want to use existing data wherever possible. • Pros: • Minimal time commitment • Readily available • Ensures data collected is used for multiple purposes • Cons: • May be limited by what is available

  26. To identify appropriate evaluation methodologies Surveys • Use surveys when: • The information you need to obtain is not available from existing data, and cannot easily be obtained through inventories or databases • You don’t need detailed qualitative data • You have enough time to do it right

  27. To identify appropriate evaluation methodologies Surveys • Pros: • Can get at factors not easily obtainable from other sources – for example, motivation, attitudes • Typically does not require a large time commitment from participants • Best for Likert-scale, checklist, and short-answer questions • Cons: • Relies on self-report • Not good if you want in-depth, qualitative comments • Language, response rates, etc. are often challenged
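
Because survey items of this kind are mostly Likert-scale and checklist responses, the basic analysis is counting and averaging. Below is a minimal sketch in Python, assuming a hypothetical CSV export with one row per respondent and question columns coded 1-5; the file name and column names are illustrative, not an actual Qualtrics export layout.

    import csv
    from collections import Counter
    from statistics import mean

    # Hypothetical export: one row per respondent, columns Q1-Q3 coded 1-5,
    # with blank cells where a respondent skipped the item.
    with open("survey_results.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    for question in ["Q1", "Q2", "Q3"]:
        answers = [int(r[question]) for r in rows if r[question].strip()]
        counts = Counter(answers)  # distribution of 1-5 ratings
        print(f"{question}: n={len(answers)}, mean={mean(answers):.2f}, "
              f"distribution={dict(sorted(counts.items()))}")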

  28. Purpose of a focus group: “The purpose of a focus group is to listen and gather information. It is a way to better understand how people feel or think about an issue, product, or service. Participants are selected because they have certain characteristics in common that relate to the topic of the focus group” (Krueger, Richard A., and Mary Anne Casey. Focus Groups: A Practical Guide for Applied Research. 3rd ed. Thousand Oaks, CA: Sage Publications, 2000).

  29. What a focus group is • A carefully planned series of discussions designed to obtain perceptions on a defined area of interest • Discussions held in a permissive, nonthreatening environment • Groups of 5-12, facilitated by a skilled facilitator • Repeated groups – need to hold at least 3-4 sessions

  30. When to use focus groups: • Looking for a range of ideas or perspectives on an issue • Trying to determine differences in perspectives between groups of people • “A group possesses the capacity to become more than the sum of its parts” (Krueger and Casey, 2000) • Researcher needs information to complement existing quantitative data

  31. When NOT to use focus groups: • Want people to come to a consensus • Want to educate people • Don’t intend to use results • Other methodologies can produce better results • Other methodologies can produce same results more efficiently • Don’t have enough time to follow standard focus group procedures

  32. To identify appropriate evaluation methodologies Focus Group/Interviews • Pros • Allows you to go in depth on an issue • Allows for a range of ideas – and you hear from people in their own voice • Cons • Most time-consuming approach

  33. To identify appropriate evaluation methodologies Inventories/Usage data/Observations • Pros: • Typically more accurate than self-report data • Good for tracking participation in events, trainings, programs, etc. • Cons: • Only works for things you can count or observe • Often have to rely on other people to obtain the data • When multiple people are involved, need to make sure that everything is counted in the same way

  34. To identify appropriate evaluation methodologies Practice: • Circle the evaluation methods that you feel can be utilized to obtain the information in the second column of your sheet. Circle multiple methods if you can use more than one method.

  35. Tying it all together • You have two more columns on your sheet – a target and a target date. • It is important to identify specific targets and dates before you collect any data – once the data are collected, people will use them to prove their point. • Often you cannot identify a target until your evaluation methods are more fully developed – so we will not be filling in these columns today.

  36. Tying it all together • What you have on your sheets now is typically your ideal scenario for how you can go about evaluating your process/product/program. • Usually, you don’t have time to do it all, so next you need to determine what is feasible given your time, resources, and who will be reviewing the results.

  37. Tying it all together Start by sharing your plan with your key stakeholders to get their input. • Explain why you have chosen the methods that you have. Convince them that you have good reason for choosing the methods that you did. • What concerns do they have about the plan? • Given your available time and resources, you need their help to prioritize what is most important to them. • It is often helpful to obtain approval of the plan in writing.

  38. Tying it all together Practice: • Share your plan with a neighbor. Pretend they are one of your key stakeholders. Get their input.

  39. Tying it all together Discussion question: • Is anyone willing to share their plan with the group?

  40. To implement basic evaluation methodologies Now that you have approval on your plan, the next step is to develop your instruments and implement them.

  41. To implement basic evaluation methodologies Considerations: • Before collecting your data, develop a data analysis plan. • Just as it is important to get approval on your evaluation plan before starting, it is also important to obtain approval on the instrument and data analysis plan. • Pilot testing is important – administer the survey/focus group/form to people similar to those who will receive it.
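
One lightweight way to put a data analysis plan in writing before any data are collected is to pair each evaluation question with its data source and intended analysis, and circulate that for approval along with the instrument. A minimal sketch in Python follows; the questions, sources, and analyses listed are illustrative assumptions, not part of the course materials.

    # Illustrative data analysis plan, agreed on before data collection begins.
    analysis_plan = [
        {
            "question": "Did participants' knowledge of evaluation methods increase?",
            "data_source": "end-of-session survey, items Q1-Q3 (Likert 1-5)",
            "analysis": "mean rating per item; compare against target once set",
        },
        {
            "question": "Did participants apply at least one thing they learned within a month?",
            "data_source": "one-month follow-up survey, open-ended item",
            "analysis": "code responses into themes; report counts per theme",
        },
    ]

    for row in analysis_plan:
        print(row["question"])
        print("  data source:", row["data_source"])
        print("  analysis:", row["analysis"])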

  42. To implement basic evaluation methodologies Existing Data • Can sometimes find a survey, focus group or other study that has recently been administered that covers the same topics that you are assessing • Several ways to access existing data at Stout: • Contact the BPA office • Survey Clearinghouse (in development) • Information Portal (in development) • Qualitative results: http://www.uwstout.edu/static/bpa/ir/surveylistqual.html

  43. To implement basic evaluation methodologies Surveys • Qualtrics is the online survey instrument that Stout uses: http://www3.uwstout.edu/bpa/survey/ • Qualtrics question library in development. Will be available from the above link. • Will provide suggested demographic questions • Will also provide suggestions for other standard questions • Typically don’t want it to take more than 10 minutes to complete – time to complete is more important than the number of questions.

  44. To implement basic evaluation methodologies Surveys, cont. • Look for other surveys to model yours after – internally developed and externally developed. Request permission from the author to use some of the same questions. • Survey Clearinghouse (in development) • Surveys webpage: http://www.uwstout.edu/static/bpa/ir/surveylist.html • Sample survey guide: http://www.uwstout.edu/static/bpa/ir/afu/information/instruction.pdf

  45. To implement basic evaluation methodologies Focus Groups/Interviews • Different from a “listening session” Discussion question: How are focus groups different from listening sessions?

  46. To implement basic evaluation methodologies Core Ingredients • Continue groups until you hear the same themes repeated – the ARC typically holds 8 sessions. Sometimes it is possible to hold fewer. • No more than 12 people in a group • Circle seating • Typically requires formal training for facilitators/moderators • Pre-determined questions • Systematic analysis

  47. To implement basic evaluation methodologies Inventories/Usage data/Observations • Must provide training to the people who will be entering data into databases or inventories. Include specific instructions. • Examples: • Sign-in sheets outside of a tutor center (sample in separate document) • Appointment logs • Formal observation forms, with checklists, open-ended questions, tally marks, etc. (sample in separate document)
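
For sign-in sheets, appointment logs, and similar usage data, the analysis step is usually a consistent tally. Below is a minimal sketch in Python, assuming a hypothetical log file with one row per visit and a date column in YYYY-MM-DD form; the file name, column name, and counting rule are assumptions and should match whatever instructions the data-entry staff were given.

    import csv
    from collections import Counter
    from datetime import datetime

    # Hypothetical log: one row per visit, with a "date" column (YYYY-MM-DD).
    with open("tutor_center_signins.csv", newline="") as f:
        visits = list(csv.DictReader(f))

    # Tally visits by month so every record is counted the same way.
    by_month = Counter(
        datetime.strptime(v["date"], "%Y-%m-%d").strftime("%Y-%m") for v in visits
    )
    for month, count in sorted(by_month.items()):
        print(f"{month}: {count} visits")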

  48. To utilize evaluation results Evaluation results are primarily used for two purposes: • Program/process/product improvement (formative) • Decisions about program/process/product continuation/discontinuation (summative) Both are important

  49. To utilize evaluation results Formative evaluation: • Typically done early on in a program. Usually identifies improvements that can be made related to program implementation.

  50. To utilize evaluation results Formative evaluation, examples: • Decisions about improvements that can be made to get more people to attend a pre-college program. • Decisions about improvements that can be made to get more people to apply what they’ve learned from training programs. • Decisions about new ways to present or deliver departmental newsletters.
