Engaging Your Community in Practical Program Evaluation
Stephanie Welch, MS-MPH, RD, LDN
Knox County Health Department
Phone: (865) 215-5297 | Email: email@example.com
Background – East Tennessee Region, Tennessee Department of Health
• Mostly rural 15-county region in East Tennessee
• Community Health Councils (HCs), facilitated by public health staff, established in each county
• Implementing programs to address priorities developed through a “MAPP-like” community health assessment and planning process
• Evaluation component missing from the outset
Challenges/Opportunities
• With no evaluation framework in place, evaluation would need to be retrospective
• HCs already active with current programs
• Negative perceptions about evaluation
• Adequacy of staff time and resources for facilitation and follow-up
• Recognition of current and previous work of HCs
• Previous positive planning experience; HCs primed for MAPP
• Engage community members in a positive evaluation experience
• Utilize student resources in a partnership with an MPH program
Six Steps of Evaluation
• Engage stakeholders
• Describe the program
• Focus the evaluation design
• Gather credible evidence
• Justify conclusions
• Ensure use and share lessons learned
Adapted from: Centers for Disease Control and Prevention. Framework for Program Evaluation in Public Health. MMWR 1999;48(RR11):1-40
Setting
• Health Council evaluation “retreats”
• Facilitated and recorded by PH staff and MPH students
• Lunch provided
• Fun and interactive
Agenda
• Background and overview of the evaluation framework
• Review existing HC priorities and the data used to develop those priorities
• Review recent data, if available
• Group brainstorm to identify HC programs and projects to evaluate
• Break into subgroups to design the evaluation
• Each participant leaves with a task
Evaluation Planning Worksheet
_______ County Health Council
To be completed by evaluation committees.
Name of Health Council program/initiative: _______________________________
Describe the Program:
• What priority or priorities does/did the program/project/initiative address?
• Identify specific people, groups of people, or organizations who are/were served or affected by the program:
• What are/were the anticipated outcomes of the program?
• What are/were some specific activities of the program to achieve the desired outcomes?
• What are/were current issues or trends affecting your program?
  o Local/State/National
  o Social/Economic/Political
Focus on Evaluation Design:
• What is the purpose of the evaluation?
  o Improvement?
  o Accountability?
  o Knowledge?
• Identify specific users of the evaluation findings (e.g., elected officials, funding sources, specific community groups) and how they will use the findings:
• What data will you need to collect to measure the outcomes and activities or interventions of your program?
  o For each piece of data, specify a source and/or method of collection:
  o Indicate WHO will be responsible for collecting each piece of data:
  o Consider the accuracy and credibility of the data/source.
Data Source | Who will collect | By what date
Evaluation Standards
• Utility – Does the evaluation meet the needs of its users?
• Feasibility – Is the design practical and non-disruptive?
• Propriety – Are the methods and interpretation fair and ethical?
• Accuracy – Are findings based on accurate and reliable information?
Gathering Credible Evidence
• Indicators
  o Participation rates, client satisfaction, measured changes in attitudes or behaviors
  o Policy changes, partnerships formed, funds or resources leveraged
• Sources
  o Interviews, documents (survey results, newspapers, meeting minutes), observations
• Consider the quality and quantity of data
• Logistics – Does the data already exist? How difficult will it be to collect or find?
Example
• Priority Health Issue: Teen Pregnancy
• Initiative: Teen Pregnancy Prevention Taskforce
• Indicators:
  o Partnerships – 11 agencies and community groups
  o Resources – $25,000 local funds and $25,000 grant funds
  o Programs –
    • Case management to prevent a 2nd pregnancy (12 participants)
    • STARS mentoring and education program (540 participants)
  o Policies – Curriculum policy (denied)
  o Prevalence Data – Decreased teen pregnancy rate
Follow-up Meeting
• Judgments on program performance based on the standards and values of community stakeholders (HC members)
• Share and discuss the information collected
• Interpret
  o Determine the practical significance of findings
• Form judgments
  o Is the program worthwhile? Can it be improved?
• Make recommendations
Sharing the Evaluation Results
• Retreats often covered by local media
• Brochures created to share the information learned
  o Student project
  o Used as a promotional tool for HCs in public meetings
Benefits and Lessons Learned
• Community members learned firsthand the importance of incorporating evaluation at the front end of project planning
• Process re-energized HCs
• Provided an excellent segue into MAPP
• Staff, HC members, and students gained experience with a flexible and accessible evaluation framework