
Monitoring and Evaluation for FETPs Donna S. Jones, MD, MPH






Presentation Transcript


  1. Monitoring and Evaluation for FETPs
  Donna S. Jones, MD, MPH
  Division of Public Health Systems and Workforce Development, Center for Global Health

  2. Division M&E Goal for FETPs
  • Improve monitoring and evaluation of field epidemiology training programs to support program improvement, quality, and sustainability, and thus to strengthen public health systems.

  3. Levels of FETP Monitoring/Evaluation
  • Critical Outcomes: the intended results of an FELTP implemented in an MOH
  • Multi-site FETP Evaluation: in development
  • Facilitated Self-Assessment (Scorecard): in use
  • Accreditation: in development
  • Program Monitoring: standard information used to track program implementation and outputs, including tracking the completion and quality of individual trainee competencies
  • Routine data reporting to the Division

  4. Critical Public Health Outcomes
  • Increased Capacity of the Workforce
  • Strengthened Surveillance Systems
  • Improved Preparedness and Response
  • Expanded Collaboration and Networking Within and Across National Boundaries
  • Enhanced Effectiveness of Policies and Practice

  5. Multi-site Evaluation
  • Last done in 1998 by Battelle
  • Meeting last year to draft quality indicators
  • Funded this year with end-of-year funds
  • Up to 10 sites
  • Expert consultants meeting this week

  6. Planned Multi-site Evaluation: Draft Quality Measures
  Mentored Public Health Activities in Applied Epidemiology
  • Ratio of time spent on mentored public health activities in applied epidemiology vs. classroom
  • Access to and use of public health surveillance data
  • Utilization of public health surveillance data for action
  • Access to participate in a variety of relevant outbreak investigations
  • Access to participate in relevant epidemiologic assessments of other public health priorities
  • Utilization of field activity results as a basis for public health action
  • Number of abstracts and manuscripts accepted to meetings and journals
  • Quality of abstracts

  7. Draft Quality Measures: Planned Multi-site Evaluation (cont.)
  Service and Value to Ministry of Health or other primary government public health institution (MOH*)
  • Impact of field activity results, e.g. a policy or intervention created/modified as a result of FETP activities
  • Requests for assistance by the MOH* and national programs
  Workforce Strengthening
  • Graduates in relevant positions of the country health system over time
  Supervision and Mentoring
  • Sufficient program staff resources for supervision and mentoring
  • Guidelines and orientation for supervision and mentoring
  • Periodic systematic evaluation of supervision and mentoring
  • Trainees completing the required activities within the required time

  8. FETP Facilitated Self-Assessment Tool “Scorecard”

  9. Overview
  • Developed to assist countries in examining their strengths and weaknesses
  • Help programs focus on priority areas for improvement
  • Consider a common vision of a successful FETP
  • Work collaboratively with other programs

  10. Prior Problems
  • Simple counting was insufficient
  • Results were long in coming
  • Results were not useful
  • Programs were not actively engaged in the process

  11. Major Categories
  • Competency-based Curriculum
  • Field Activities
  • Leadership Development
  • Program Management
  • Sustainability


  13. Matrix Tool for FETP Assessment: Capabilities and Indicators
  1. Competency-based Training
  A. Competency-based curriculum
  B. Resident/Officer Assessment
  C. Mentor Support
  D. Mentor Assessment
  E. Coursework Faculty
  F. Field Sites/Work Sites
  G. Graduates
  2. Public Health Work/Field Activities
  A. Outbreak Investigations
  B. Surveillance
  C. Public Health Studies
  D. Communications

  14. Matrix Tool for FETP Assessment: Capabilities and Indicators (cont.)
  3. Public Health Leadership Development
  A. Public health leadership
  B. Ministry of Health retention/Career progression
  4. Management
  A. Policies and procedures
  B. Resident/Officer Selection
  C. Staffing
  D. Office logistics/infrastructure
  E. Course work logistics
  F. Field work logistics
  5. Sustainability
  A. Support by Ministry of Health
  B. Sustainable leadership for program
  C. Graduate network
  D. Planning
  E. Partnerships
  F. Advisory Board/Steering Committee
  G. Advocacy for Program



  17. Implementing the Assessment
  • Country program interest
  • External and internal teams identified
  • Arrange for a 4-5 day visit (logistics)
  • Program compiles requested documentation

  18. Implementing the Assessment (cont.)
  • External team interviews key groups (fellows, graduates, former directors, supervisors, mentors)
  • Program and team review the diagnostic tool together and agree on levels of achievement; documentation is reviewed
  • Agreement on key items to be worked on in the short, medium, and long term

  19. Typical Assessment Week
        Mon          Tue         Wed         Thu
  am    Orientation  Interviews  Interviews  Documentation
  pm    Interview    Interviews  Workshop    Preliminary Report
  • Focus group interviews
  • Scorecard workshop (coordinating staff +)
  • Preliminary report left in country (briefing)

  20. Thoughts on Why It Works
  • Focused, intense effort; easily understood process
  • Billed as a “self-assessment,” NOT an evaluation
  • The process is itself networking/advocacy
  • We leave a report, a “roadmap” for improvement

  21. Accreditation
  • Process in development with TEPHINET
  • Standards being drafted
  • Focus on core processes such as:
  • Program infrastructure
  • Program management
  • Standard procedures
  • Routine evaluation procedures for residents, faculty, curriculum, and program

  22. Draft Accreditation Standards: Staffing (director, staff, supervisors/mentors)
  • The program has a host-country director/coordinator who is assigned full-time to the FETP.
  • The program has adequate technical, administrative, and clerical staff for the effective administration of the program and support of the FETP’s trainees.
  • The program has mentors, trained in the goals of the program and in field epidemiology methods, who provide on-the-job supervision to the trainees.
  • All mentors/technical supervisors are adequately qualified (e.g. graduates of the program or a similar applied epidemiology program), and this qualification is documented.

  23. Draft Accreditation Standards: Administrative and Logistic Support
  • The program has dedicated office space in which the staff and trainees can work, including basic office equipment such as telephone, computers, e-mail, fax, and internet access.
  • Trainees have ready access to reference material in print or electronic format, including the capability to search electronic medical literature databases.
  • The program has access to the necessary supplies and logistical support for field investigations.
  • The program has access to laboratory services in support of investigations, as appropriate.

  24. Program Monitoring
  • What should you be doing routinely?
  • Are you doing it?

  25. Minimum Requirements for FETP Core Functions
  • Expected competencies are documented.
  • Every competency has a trackable method for determining its achievement; the method includes a measure of work quality.
  • Trainees have regular access to trained mentors who can provide sufficient and adequate supervision for the public health work.
  • Supervisors and trainees meet routinely (e.g. weekly or monthly) to review their work and other topics of interest.
  • Trainee assessment occurs routinely and is documented. Progress is reviewed with the trainee and supervisor at least every 6 months.
  • The program reviews the curriculum and training program each year, and changes/improvements are documented.
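The requirement above — documented competencies, each with a trackable assessment method that includes a quality measure — could be sketched in code. This is a minimal illustrative sketch only, not any real FETP system; all names (`Competency`, `Assessment`, `on_track`, the 1-5 quality score, the passing threshold of 3) are assumptions chosen for the example.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Assessment:
    method: str          # how achievement was determined, e.g. "outbreak report reviewed by mentor"
    quality_score: int   # hypothetical 1-5 quality rating assigned by the supervisor
    assessed_on: date    # when the assessment was documented

@dataclass
class Competency:
    name: str
    assessments: list[Assessment] = field(default_factory=list)

    def achieved(self, passing_score: int = 3) -> bool:
        # A competency counts as achieved once at least one documented
        # assessment meets the program's quality threshold (threshold is assumed).
        return any(a.quality_score >= passing_score for a in self.assessments)

def on_track(competencies: list[Competency]) -> float:
    """Proportion of competencies achieved -- the kind of figure that could
    feed the annual-review item 'proportion of residents on track to
    complete competencies'."""
    if not competencies:
        return 0.0
    return sum(c.achieved() for c in competencies) / len(competencies)
```

Because every assessment carries a method, a quality score, and a date, the structure satisfies the slide's three demands at once: achievement is trackable, quality is measured, and the record is documented for the six-monthly progress review.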

  26. Possible Routine Reports from Residents and Supervisors
  • Monthly report of activities, provided to the field site supervisor and technical supervisor, with cc to the program director and RAs
  • Quarterly report by the resident of progress toward completion of required field activities, including a self-assessment and an evaluation of the supervisor
  • Quarterly performance evaluation by the field site supervisor
  • Quarterly (or 6-monthly) progress assessment and development of the next quarter’s workplan with FETP staff supervisors and the RA
  • Documentation for required activities (program dependent)

  27. FETP Quarterly Program Monitoring
  Quarterly resident assessments
  • Quarterly report submitted
  • Quarterly supervisor assessment received
  • Quarterly assessment conducted with the participant
  • Improvement plan (or revised workplan)
  Regular meetings (weekly/monthly)
  • Meetings held regularly
  • Appropriate MOH staff in attendance
  Support to Ministry of Health
  • List requests for assistance and the program’s response
  • Recommendations from field activities presented to the appropriate level of the MOH

  28. Suggested Accreditation Standard
  The program must monitor and track each of the following areas:
  • trainee performance,
  • mentor/supervisor/faculty performance,
  • field assignment performance,
  • graduate performance,
  • program quality.

  29. Annual Program Review
  • Quarterly monitoring information
  • Review of the number, type, and adequacy of outbreak investigations
  • Analysis and use of surveillance data
  • Improvements to surveillance systems
  • Uptake of recommendations (actions/policies, etc.)
  • Proportion of residents on track to complete competencies
  • Communication products (bulletins, abstracts, presentations, manuscripts)
  • Review of training gaps identified in field work, and plans to address them
  • Advisory committee activities
  • Review of field sites (support, supervision, routine work, access to data, projects undertaken, use of findings)

  30. Routine Reporting to Division
  • Before: an annual report request
  • Now: in process, but expect routine (monthly or quarterly) updates on core information, e.g.:
  • New cohorts
  • Graduates
  • Outbreaks
  • Presentations
  • Publications
  • Other resident activities
  • Other trainings
  • Etc.
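A routine update on the core information listed above is essentially a small fixed record per program per quarter. The sketch below is a hypothetical illustration of such a record; the class name, field names, and reporting format are assumptions, not part of any actual Division reporting system.

```python
from dataclasses import dataclass, asdict

@dataclass
class QuarterlyUpdate:
    """Hypothetical core-information record for a routine update to the
    Division, following the slide's list of items. Counts default to zero
    so a program only fills in what changed this quarter."""
    program: str
    quarter: str
    new_cohorts: int = 0
    graduates: int = 0
    outbreaks_investigated: int = 0
    presentations: int = 0
    publications: int = 0

    def to_report(self) -> dict:
        # Flatten to a plain dict, e.g. for a CSV row or JSON submission.
        return asdict(self)
```

Keeping the record flat and uniform across programs is what makes monthly or quarterly roll-ups straightforward compared with the previous free-form annual report.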

  31. Questions?

  32. Quality: System, Program, Individual
  [Diagram: quality shown at three nested levels: system, program, and individual]
