
Evaluating Your Program: How Do You Know If It Worked?

Presentation Transcript


  1. Evaluating Your Program: How Do You Know If It Worked? Susan Donaldson Cooperative Extension Bringing the University to You

  2. What is the purpose of the evaluation? • To improve a program, or your own teaching • To help others (including a funding agency) understand the program and its results • To guide future actions/programs • To determine if the program is worth the cost • To win support and/or funding • For academic interest (!)

  3. PLAN THE EVALUATION AT THE SAME TIME AS YOU PLAN THE PROGRAM! What do I want to know, and how will I know it?

  4. Design your program with specific learning or action objectives in mind • What results do you want from your program? • Knowledge (understanding) • Beliefs, attitudes and opinions (psychological states) • Behavior (What actions do you want them to take?)

  5. Outputs are not the same as outcomes or impacts • Outputs are the things you produce or deliver: brochures and other printed materials, the number of events held, etc. • Grantors often need these numbers • Producing a brochure does NOT guarantee learning or outcomes!

  6. Some questions you might ask… • What do people do differently as a result of the program? • Who benefits, and how? • Are the program’s accomplishments worth the resources invested? • What are the social, economic, environmental impacts?

  7. Some questions you might ask… • What are the strengths and weaknesses of the program? • How well does the program respond to the need that caused you to design it? • How efficiently are resources being used? • How well does the program fit in the local setting?

  8. What information do you need to answer the questions? Indicators: how will I know success when I see it? • Participation • Reactions (maintain interest, accept leadership responsibility) • Learning (sufficient knowledge and skills developed) • Actions (knowledge is applied; plans are made; collaboration occurs) • Impact (successful weed management planning and implementation occurs; decrease in infested acres, etc.)

  9. Kinds of information • Numerical (statistical) • Narrative. What will the recipients consider to be credible information? What best meets their need?

  10. Sources of information • Surveys • Observations • Interviews • Tests • Group techniques • Case studies • Photographs • Document review • Testimonials • Etc…

  11. The Truckee Meadows example Identify the audiences: • City and County government • Citizens and neighborhood advisory board members • The general public

  12. City and County Government • Message: We have a problem. You have a legal responsibility. Failure to take action will increase future costs. • Desired action: Form a functional CWMA • Hook: Forming a CWMA will help get funding.

  13. City and County Government Evaluation: PARTICIPATION • Does CWMA form? • Do city and county personnel participate fully? REACTION • Does participation continue over time? LEARNING • Is the ability to manage weeds improved? ACTIONS • Do collaborations develop among entities? • Is budget allocated to weed management? IMPACT • Are weeds managed?

  14. What happened? • 36 city, county, agency, and citizen representatives attended the first meeting • Attendees agreed that a CWMA was needed • 12 weeds of concern were prioritized for group effort • Attendees decided to continue to meet

  15. What happened (outcomes)? • Staff now attend annual trainings • An MOU and strategic plan were developed • Washoe County and City of Reno are collaborating • County has hired a Natural Resources Manager; position will coordinate weed management • Grants were obtained • Seasonal mapping staff were hired • Control budgets are still not adequate • Issue is easily de-prioritized

  16. Indicators PARTICIPATION • Does CWMA form? (YES) • Do city and county personnel participate fully? (NO) REACTION • Does participation continue over time? (YES) LEARNING • Is the ability to manage weeds improved? (SOMEWHAT) ACTIONS • Do collaborations develop among entities? (YES) • Is budget allocated to weed management? (NOT ENOUGH) IMPACT • Are there fewer weeds? (COULD BE A LOT BETTER!)
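
One lightweight way to keep a rubric like this usable across years is to record each indicator as a question with its current answer. Below is a minimal sketch in Python; the question text and answers are taken from slides 13 and 16, but the data structure and script are illustrative assumptions, not part of the original program.

# Minimal sketch: the slide-13 rubric with the slide-16 answers filled in.
# The dict structure and this script are illustrative, not from the source.
rubric = {
    "PARTICIPATION": {
        "Does CWMA form?": "YES",
        "Do city and county personnel participate fully?": "NO",
    },
    "REACTION": {"Does participation continue over time?": "YES"},
    "LEARNING": {"Is the ability to manage weeds improved?": "SOMEWHAT"},
    "ACTIONS": {
        "Do collaborations develop among entities?": "YES",
        "Is budget allocated to weed management?": "NOT ENOUGH",
    },
    "IMPACT": {"Are weeds managed?": "COULD BE A LOT BETTER"},
}

for level, questions in rubric.items():
    for question, answer in questions.items():
        print(f"{level}: {question} -> {answer}")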

  17. CAB & NAB (citizen and neighborhood advisory board) members • Message: Weeds are affecting your neighborhood, and you can help fight them • Action: Incorporate weed cleanups into neighborhood cleanup days • Hook: We’ll supply bags, information and help • Reaching the audience: Personal contact with each group via a “canned” presentation at a regular meeting

  18. CAB & NAB members Evaluation tools: • Number of “champions” recruited (PARTICIPATION) • Number of projects incorporating weeds (ACTIONS) • Neighborhoods have fewer weed problems (IMPACT)

  19. The campaign: • Presentations were made at 12 NABs and CABs in April and May, reaching 240 people • Each group was given a binder of information • One “champion” was recruited; many asked for information and assistance • Number of events: one in 2004, five in 2005

  20. General public • Message: Weeds are ruining your neighborhood • Action: Identify, pull, and dispose of yellow starthistle • Hook: The “ouch!” factor – this affects YOU!

  21. General public • Methods: PR campaign using movies, buses, hotline • Money: NDOA (Nevada Department of Agriculture) grant • Evaluation tool: Number of calls to hotline (LEARNING & ACTION) • Future evaluation tools: Hits on website, reports filed on website, random phone survey (need funding)

  22. What happened? • A $10,000 grant was obtained for an education and outreach campaign • Campaign elements included: • presentations to advisory boards • promotion of a weed hotline • bus placards • slides in movie theaters

  23. 2004 PR campaign • Washoe County donated a phone line for a hotline • Hotline magnets distributed at public events • Advertised via news releases and PSAs • Volunteers agreed to transcribe messages on a monthly basis
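
Because volunteers transcribed the hotline messages monthly, a calls-per-month tally (the chart on slide 28) is straightforward to produce. Here is a minimal Python sketch, assuming a hypothetical transcription log named hotline_log.csv with one row per call and a date column in YYYY-MM-DD format; both the file name and the column are assumptions for illustration.

import csv
from collections import Counter
from datetime import datetime

# Tally hotline calls per month from a hypothetical transcription log.
# "hotline_log.csv" and its "date" column are illustrative assumptions.
calls_per_month = Counter()
with open("hotline_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        month = datetime.strptime(row["date"], "%Y-%m-%d").strftime("%Y-%m")
        calls_per_month[month] += 1

for month in sorted(calls_per_month):
    print(month, calls_per_month[month])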

  24. Bus Placards 9 buses for one month (potential viewership 150,000) Cost: $1,566.34

  25. Movie Slides 3 different slides shown for 6 weeks on half of all Reno/Sparks movie theater screens (21 total)

  26. Movie Slides Potential viewership = 126,600 people

  27. Movie Slides Number of times repeated = 8,820/slide Cost: $2,370.50
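
The viewership and cost figures on slides 24–27 allow a rough cost comparison between the two paid channels. A minimal sketch of the arithmetic (potential impressions only, not verified views):

# Rough cost per 1,000 potential views, using only the numbers
# quoted on the slides above.
channels = {
    "Bus placards": (1566.34, 150_000),
    "Movie slides": (2370.50, 126_600),
}
for name, (cost, views) in channels.items():
    print(f"{name}: ${cost / views * 1000:.2f} per 1,000 potential views")
# Bus placards: $10.44 per 1,000 potential views
# Movie slides: $18.72 per 1,000 potential views

By this crude measure, the bus placards delivered potential impressions at roughly half the cost of the movie slides, though neither number says anything about actual awareness, which is the evaluation gap slide 29 notes.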

  28. Hotline Calls Logged by Month [Chart: number of hotline calls logged each month]

  29. Did it work? • Increase in hotline calls was small relative to amount of PR • Calls to hotline dropped off once PR stopped (and fall came) • Need better method to evaluate public awareness • Campaign elements were changed for 2005
