
What Do Programs Want?




  1. What Do Programs Want? Jeanette Treiber, PhD Project Director Center for Program Evaluation and Research UC Davis School of Medicine

  2. Objective Determine local agencies’ evaluation capacity building needs

  3. Background • The California Department of Public Health (CDPH) Tobacco Control Program is carried out by: • 61 public health departments • Approximately 40 competitive grantees

  4. More Background: CDPH Strategy • Norm change • Community base • Community assessments • 3-year work plans • Evaluation

  5. Evaluation Components Local program evaluation requirements: • Comprehensive evaluation plan • Process and outcome evaluation • Internal or external evaluators • Progress reports • Final evaluation reports

  6. Tobacco Control Evaluation Center (TCEC) TCEC provides evaluation capacity building services to California tobacco control programs through: • Technical assistance (TA) • Training • Tools • Final Evaluation Report (FER) scoring and feedback

  7. Rationale for TCEC Services “…pressures have been accompanied by increasing awareness (and often frustration) on the part of funders as they confront the limited ability of the organizations they fund to meet the challenge of evaluating programs in ways that are likely to yield either genuine outcome-oriented accountability or useful information for formative improvement programs” (Stevenson et al., 2002)

  8. Evaluation Capacity Building • Observed need and response in the past decade • Various approaches to capacity building (CB) • Various definitions • Various approaches to measuring the effect of CB • Still little research

  9. Method • A 21-item electronic survey (administered via the online service surveymonkey.com) sent to a list of 170 project directors and evaluators • Analysis of an Access database of technical assistance requests from 2004 to 2009 (approx. 4.5 years)

  10. Results (survey) Need for training or materials (high or moderate need):
  • Using culturally competent evaluation methods: 79.4%
  • Writing evaluation plans: 59.9%
  • Adapting/developing data collection instruments: 57.1%
  • Interpreting evaluation results: 54.1%
  • As a reference during evaluation activities: 51.5%
  • Writing evaluation reports: 50.0%
  • Collecting evaluation data: 42.9%

  11. Projected Use of TCEC (survey) Projected use (response percent):
  • Finding, developing, or adapting data collection instruments: 68.6%
  • Writing an evaluation plan: 57.1%
  • As a reference during evaluation activities: 48.6%
  • To build overall evaluation capacity of the project: 42.9%

  12. Preferred Format (survey)
  • Webinar: 68%
  • “How-to” evaluation guides on website: 57.1%
  • Individual consultation by phone or e-mail: 45.7%
  • Website strategy exchange: 42.9%

  13. Requests for Assistance (TA Log) • Total number of TA requests between 11/2/04 and 7/16/09: 512 • Average: 11 requests/month • Up to 27 requests per agency in total

  14. Main Request Types (TA Log) [chart not reproduced in transcript]

  15. Type of Data Collection Instrument [chart not reproduced in transcript; reported counts: 189, 44, 39]

  16. TA Demand [chart not reproduced in transcript; reported values: 79% and 71%]

  17. FER Score and TA Requests Final Evaluation Report (FER) scores for Local Lead Agencies were correlated with the frequency of TA requests. No statistically significant correlation was found.
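The correlation check described on this slide can be sketched as a Pearson correlation between two per-agency series. This is a minimal illustration only: the agency-level FER scores and TA request counts below are hypothetical placeholders, since the actual data are not included in the transcript.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical placeholder data: one value pair per Local Lead Agency.
fer_scores = [72, 85, 64, 90, 78, 81, 69, 88]   # illustrative FER scores
ta_requests = [14, 3, 9, 11, 6, 2, 13, 7]       # illustrative TA request counts

r = pearson_r(fer_scores, ta_requests)
# With a small sample, only a |r| far from 0 would be statistically
# significant; a value near 0 is consistent with the slide's finding.
print(round(r, 3))
```

In practice this would be computed with `scipy.stats.pearsonr`, which also returns a p-value for the significance test the slide refers to.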

  18. Summary/Conclusion • The majority of local health departments AND community organizations in California tobacco control require evaluation assistance • The greatest need is for help with data collection instruments (DCIs), developing evaluation plans, and conducting evaluation with diverse populations • Most DCI development need concerns surveys • Webinars are the most desired mode of training delivery • Individual assistance requests vary greatly in frequency, scope, and content

  19. Limitations • The response rate to the survey was only 20% • TA requests were not consistently entered into the log in the same manner • Requests for TA depend somewhat on how TA services are marketed • TCEC team personnel changed over the 4.5 years

  20. What Do Programs Want? Flexible, varied evaluation assistance; convenient delivery

  21. References • California Department of Public Health, Tobacco Control Program. http://www.cdph.ca.gov/programs/tobacco/Pages/default.aspx • Innovation Network – Transforming Evaluation for Social Change. http://www.innonet.org/ • Preskill, Hallie. (2005). Building Evaluation Capacity: 72 Activities for Teaching and Training. Thousand Oaks, CA: Sage.

  22. More Resources • Stevenson, John, Paul Florin, Dana Scott Mills, and Marco Andrade. (2002). Building Evaluation Capacity in Human Service Organizations: A Case Study. Evaluation and Program Planning, 25, 233-243. • Tobacco Control Evaluation Center (TCEC), Center for Program Evaluation and Research, UC Davis. http://programeval.ucdavis.edu • University of Wisconsin Extension, Program Development and Evaluation. http://www.uwex.edu/ces/pdande/evaluation/index.html

  23. Contact Jeanette Treiber, PhD Project Director Center for Program Evaluation and Research Tobacco Control Evaluation Center UC Davis School of Medicine jtreiber@ucdavis.edu http://programeval.ucdavis.edu 1616 Da Vinci Court Davis, CA 95618
