
Evaluating Community-based Initiatives



Presentation Transcript


  1. Evaluating Community-based Initiatives May 3, 2005 12:00 noon CST Holly Ruch-Ross, ScD David Keller, MD FAAP Thomas Young, MD FAAP

  2. Declaration of Vested Interest • Holly Ruch-Ross has no interests to declare • David Keller has no interests to declare • Thomas Young has no interests to declare American Academy of Pediatrics

  3. Evaluation Overview Holly Ruch-Ross, ScD

  4. Presentation Objectives At the completion of the presentation, the listener will be able to: • Identify the distinguishing characteristics of evaluation research and the benefits of conducting evaluation. • Write clear objectives. • Identify significant evaluation challenges and strategies to address them. American Academy of Pediatrics

  5. Defining Evaluation • Research: to conduct a careful, patient, systematic, diligent inquiry or examination in some field of knowledge, to establish facts or principles; to laboriously or continuously search after truth. • Evaluate: to determine the worth of; to find the amount or value of; to appraise. American Academy of Pediatrics

  6. Defining Evaluation • Evaluation is action research, intended to provide information that is useful for: • Program development and improvement • Program replication • Resource allocation • Policy decisions. American Academy of Pediatrics

  7. Reasons to Evaluate • Checking Your Process: Are you doing what you said you would do? • Determining Your Impact: Are you having the desired effect in the target population? • Creating a Fan Base: Can you generate information and evidence to share with funders and other stakeholders? • Replication Justification: Is there evidence to support replication of this program? • Effective Documentation: Can you collect information to support your program and meet funder or other requirements? American Academy of Pediatrics

  8. The Evaluation Cycle START → Plan program and evaluation → Implement program and begin to collect evaluative data → Review data (Are you doing what you planned? Having the intended effect?) → Adjust program; refine evaluation → return to start. American Academy of Pediatrics

  9. Types of Evaluation • Process: Is the program being implemented the way it was designed? • Outcome: Is the program having the intended effect? American Academy of Pediatrics

  10. Process Evaluation Information Needs • Describe the program and implementation, who participates in the program, what services are received. • Information such as number served, patient characteristics, number of contacts with a program, number of trainings, number of referrals. American Academy of Pediatrics

  11. Outcome Evaluation Information Needs • Detect whether the intervention made a difference, what changes can be measured (knowledge, attitude, behavior, health status, incidence, prevalence) • Longer term outcomes may need to be assessed using shorter term indicators. American Academy of Pediatrics

  12. Outcomes and Indicators: Examples American Academy of Pediatrics

  13. Goals and Objectives • Goal: broad statement of what the program would like to accomplish for a specific target population. • Objective: a measurable step toward the achievement of a goal, stating who will do what by when. American Academy of Pediatrics

  14. A GOOD OBJECTIVE IS SMART: • Specific • Measurable • Achievable • Realistic for the program • Time specific American Academy of Pediatrics

  15. Evaluation Challenges: Finding the Right Tools • Tools need to measure the right construct. • Ideally, tools should be pre-existing and well-established (valid, reliable and standardized). • Tools must be appropriate for the target population (in terms of age, culture, language, literacy, other issues). • Tools must be easy to administer in the setting. • When existing tools are used, they must be readily available, affordable, and supported by the author. American Academy of Pediatrics

  16. Evaluation Challenges: Designing Your Own Tools • Adapt an existing tool to be more appropriate for target population • Review literature • Talk to other grantees • Talk to others with ideas about what you should ask about: experts, staff, recipients of services • Pilot test tools with representatives of the target population. American Academy of Pediatrics

  17. Evaluation Challenges: Using Qualitative Data • Gain insight into feelings, attitudes, opinions and motivations. • Study selected issues in depth and detail. • Gather the broadest response possible without predetermined categories. • Gain rich information about a small number of people and cases. • Put a human face on the program. American Academy of Pediatrics

  18. Most Common Qualitative Data Collection Strategies • In-depth interview: usually one-on-one structured conversation with a person who has important information about the program. • Focus group: professionally facilitated, focused discussion among selected individuals about program issues. Not all discussion groups are focus groups, but they may still be useful. American Academy of Pediatrics

  19. Evaluation Challenges: Data Collection • The sophistication of data collection should be appropriate for the scale of the project. • Plan data collection up front, including who, what and when. • Have a system in place for tracking participants (particularly if follow up is planned). • Identify the staff person responsible for data handling. • Protect participant confidentiality. • Do not collect information that you will not use. American Academy of Pediatrics
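The tracking and confidentiality points above can be sketched as a minimal contact log. This is an illustrative assumption, not part of the presentation: the field names and the study-ID scheme are invented here, and the only design rule carried over from the slide is that the log stores an assigned code rather than a name, and that one staff person appends to it.

```python
import csv
from dataclasses import dataclass, asdict

# Illustrative record for tracking participant contacts. A study ID replaces
# the participant's name so the log itself never holds identifying
# information (the confidentiality point above).
@dataclass
class Contact:
    study_id: str      # assigned code, not a name
    date: str          # ISO date of the contact
    service: str       # e.g. "screening", "referral", "follow-up"
    collected_by: str  # staff person responsible for data handling

def append_contact(path: str, contact: Contact) -> None:
    """Append one contact to the tracking log, writing a header if the
    file is new or empty."""
    try:
        new_file = open(path).readline() == ""
    except FileNotFoundError:
        new_file = True
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(asdict(contact)))
        if new_file:
            writer.writeheader()
        writer.writerow(asdict(contact))
```

Appending rows as contacts happen, rather than batching data entry at the end, also matches the later advice to begin data entry immediately.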

  20. Evaluation Challenges: Data Management and Analysis • Budget for expenses associated with data entry and analysis. • Have a strategy for data management up front. Begin data entry immediately. • Remember that data analysis should follow directly from the questions you are trying to answer about your intervention. • Know what comparative information may be available. American Academy of Pediatrics

  21. Evaluation Challenges: Technology • Involve technology as early as possible in the planning process. • Keep it as simple as possible. • Choose technology that will be around at least five years, and that is compatible with your current system. • Make certain that your choices match the technical support that is available to you. American Academy of Pediatrics

  22. Evaluation Challenges: Getting Help From an Evaluator • Specific evaluation training and applied research experience • Experience in a human service setting • Professional perspective and methodological orientation • Interpersonal skills • Self interest (i.e., can he/she put yours first?!) American Academy of Pediatrics

  23. CATCH Grant Evaluation Example Thomas Young, MD FAAP

  24. Presentation Objectives At the completion of the presentation, the listener will be able to: • Describe a CATCH Project • Describe the project’s evaluation plan • Describe the project’s data collection and management techniques American Academy of Pediatrics

  25. Keys to Successful Evaluation • Develop objectives that directly contribute to your goals for project • Spend time and thought on objectives, evaluation will follow • Objectives must be • Relevant • Measurable • Feasible/doable American Academy of Pediatrics

  26. CATCH Project Description • CATCH Implementation Grant 2003 • Goal: To improve screening, provider knowledge and access to mental health care for children and families in a primary care and continuity clinic setting • This is a broad goal for a small grant • We had to develop objectives that were relevant, measurable, and doable. American Academy of Pediatrics

  27. Developing CATCH Objectives • Process Objective 1: 80% implementation of Bright Futures mental health screening: PEDS, Pediatric Symptom Checklist, Edinburgh Postnatal Depression Screen • Evaluation Method: Chart audit • Findings: Chart audit documented successful implementation of screenings in 90% of charts American Academy of Pediatrics
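The chart-audit check behind this objective is simple arithmetic. In the sketch below the number of audited charts (30) is hypothetical; only the 80% target and the 90% finding come from the slide.

```python
# Hypothetical audit sample: each entry is True if the chart documents
# the mental health screening. 27 of 30 charts gives the 90% reported.
audited_charts = [True] * 27 + [False] * 3

target = 0.80  # process objective: 80% implementation
rate = sum(audited_charts) / len(audited_charts)

print(f"Implementation rate: {rate:.0%}")  # 90%
print("Objective met" if rate >= target else "Objective not met")
```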

  28. Developing CATCH Objectives • Process Objective 2: To provide 120 mental health consultant visits • Evaluation Method: A data file was developed and completed by the Mental Health consultant to document the number and type of mental health encounters. • Findings: 297 MH sessions American Academy of Pediatrics

  29. Data File and Report American Academy of Pediatrics

  30. Developing CATCH Objectives • Process Objective 3: To provide case management of mental health referrals • Evaluation Method: Data base developed with Excel to track referrals, patient contacts • Findings: 83 children had outside referrals and case management documented. American Academy of Pediatrics
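The project tracked referrals and patient contacts in an Excel database. As a comparable sketch, the count of children with documented case management can be derived from such records; the record fields and values below are hypothetical, not from the project’s database.

```python
# Hypothetical referral records: (child_id, agency, case_management_documented).
referrals = [
    ("C01", "housing clinic", True),
    ("C01", "early intervention", True),   # same child, second referral
    ("C02", "counseling agency", True),
    ("C03", "counseling agency", False),
]

# The finding is reported as children, not referrals, so count distinct
# child IDs among records where case management was documented.
tracked_children = {child for child, _agency, managed in referrals if managed}
print(len(tracked_children))  # 2
```

Counting distinct children rather than rows is the kind of decision worth fixing up front, since a child with several referrals would otherwise inflate the finding.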

  31. Developing CATCH Objectives • Outcome Objective 1: To improve provider confidence in managing behavioral problems in children. • Evaluation Method: Development of a pre/post survey of the providers in clinic on confidence in management of common behavior problems (no existing survey tool was found) • Findings: Provider confidence increased from 40% to 67% in 6 months; providers reporting adequate MH resources increased from 54% to 95%. American Academy of Pediatrics
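The pre/post findings on this slide are most simply read as percentage-point changes. The slide reports only the shares, not the number of providers surveyed, so no significance test is attempted in this sketch.

```python
# Pre/post shares as reported on the slide. Survey n is not given, so we
# compute only percentage-point changes, not a two-proportion test.
measures = {
    "confidence managing behavior problems": (0.40, 0.67),
    "adequate MH resources": (0.54, 0.95),
}

for name, (pre, post) in measures.items():
    print(f"{name}: +{(post - pre) * 100:.0f} percentage points")
```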

  32. American Academy of Pediatrics

  33. Developing CATCH Objectives • Outcome Objective 2: Identify resources to sustain the mental health provider • Evaluation Method: Was funding found to continue the mental health project? • Findings: Primary Care Center funding, grants, and a partnership with a community mental health agency continued and expanded MH services. American Academy of Pediatrics

  34. Summary of CATCH Evaluation • Evaluations are different from scientific research • Good objectives lead to good evaluations • If possible, find reliable tools to measure outcomes; if not, use creative common sense. • Accurately track data from pre-grant through the end of the project • Keep it relevant but simple; for larger follow-up grants, hire an evaluator! American Academy of Pediatrics

  35. Healthy Tomorrows Grant Evaluation Example David Keller, MD FAAP

  36. Presentation Objectives At the completion of the presentation, the listener will be able to: • Describe a Healthy Tomorrows Project • Describe the project’s evaluation plan • Describe the project’s data collection and management techniques American Academy of Pediatrics

  37. Healthy Tomorrows Project Description: Family Advocates of Central Massachusetts • A legal-medical collaborative • Focused advocacy on legal issues likely to affect child health outcomes • Three objectives: • Patient identification and referral from practices • Provider training on legal issues and how they affect health • Advice and counsel for patients and families in need American Academy of Pediatrics

  38. A medical-legal collaborative Legal Assistance Corporation of Central Massachusetts and five practices in Central Massachusetts • Worcester • Pediatric Primary Care • Family Health Center • Webster • South County Pediatrics • Milford • Milford Pediatrics • Fitchburg • CHC Family Health Center American Academy of Pediatrics

  39. Focused Advocacy • Housing stability (e.g. lead poisoning, homelessness, mold and allergens) • Financial security (e.g. disability benefits, food stamps, Medicaid) • Dignity and safety (e.g. immigration status, domestic violence) • Access to services (e.g. medical, dental, mental health, special education services) American Academy of Pediatrics

  40. Model of Intervention American Academy of Pediatrics

  41. Process Evaluation • Based on program objectives: • Are the practices identifying and referring patients in need of legal services to the program? • Are we able to train medical providers on legal issues and how they affect health? • Have we provided advice and counsel for patients and families in need of legal services? • Underlying question: • What can we do better? American Academy of Pediatrics

  42. Process Objective 1: Are the practices identifying and referring patients in need of legal services to the program? • Findings in Year 1: 76 referrals • Came from all five sites. • Usually associated with trainings. • Office hours did not generate referrals. • Lessons learned: • Time with providers = more referrals. • Screening instrument needed. American Academy of Pediatrics

  43. Process Objective 2: Are we able to train medical providers on legal issues and how they affect health? • Year 1: 42 trainings • Webster: 9 • Worcester FHC: 7 • Worcester Peds Assoc: 15 • Fitchburg: 6 • Milford: 5 • Feedback (satisfaction): “Code card is great”; “More time on cases” • Lessons learned: • Providers want practical tools • Providers want case-based training American Academy of Pediatrics

  44. Process Objective 3: Have we provided advice and counsel for patients and families in need of legal services? • Findings: 58 cases closed for 50 clients. • Counsel and advice in 21 cases. • Brief service in 26 cases. • Full representation in 10 cases. • Referred 1 client to another agency. • Lessons learned: • Case mix was more varied than expected based on the needs assessment. • Less full representation was needed than we expected. American Academy of Pediatrics

  45. Outcome Objectives: ??? • Practice Outcome • Legal Outcome • Health Outcome • System Outcome American Academy of Pediatrics

  46. Legal Outcome: Evaluation Methods (LACCM database; LACCM outcome tool) • Was the case resolved in favor of the client? • Is that client better off than before in terms of: housing stability, financial security, dignity and safety, access to services? From Nasdor, Moving Beyond Funder-Driven Outcome Measures. American Academy of Pediatrics

  47. Legal Outcome: Problems What is a case? • Need to understand “jargon” of legal services. What is an outcome? • Need to integrate public health and legal data • Need an outcome for “brief intervention” Validity of outcome measure? • Need to assess inter-observer reliability American Academy of Pediatrics

  48. Practice Outcome • How have the attorneys changed their practice? • How have the physicians changed their practice? • Is there a more collaborative relationship? American Academy of Pediatrics

  49. Practice Outcome: Problems • Counting: • No ready made database • Evaluation: • Utility of self-reported comments • Data entry • Next steps: • Commitment to change • Focus groups on practice change American Academy of Pediatrics

  50. Health Outcome • What is health? • Individual or group? • Compared to what? • Measured how? • What about the confounders? American Academy of Pediatrics
