
Monitoring scaling up of AFHS





  1. Monitoring scaling up of AFHS. Monitoring the implementation of national standards for AFHS. Dr Paul Bloem, World Health Organization, April 2008

  2. What do we try to monitor?
  • We want to ensure that standards are implemented in districts & health facilities, according to plan
  • We want to check whether this indeed leads to changes in the quality of service provision
  • We want to find out whether this is accompanied by an increase in service utilisation
  • We want to be able to show that these changes can be attributed to the application of the AFHS standards

  3. 1. We want to ensure that standards are implemented in districts & health facilities, according to plan
  • Why? Because high coverage with the standards is the explicit aim of this effort (not pilot projects, but the application of national standards everywhere), we need to track where exactly implementation of the AFHS standards is taking place
  • What? We need to track which (% of) provinces & districts are supporting AFHS activities & monitoring them, and how many (% of) sites in each district are becoming more adolescent friendly

  4. 2. We want to check whether this indeed leads to changes in the quality of service provision
  • Why? The premise is that improved quality will lead to increased use of services by adolescents. A standards-driven quality improvement approach is the vehicle chosen to deliver improved quality. We need to be sure that quality is indeed improving in the AFHS sites.
  • What? We need to track the outcomes of the work done in the service delivery points, to ensure that quality is improving.

  5. 3. We want to make sure this is accompanied by an increase in service utilisation
  • Why? AFHS are being set up to remedy the fact that few young people use health services. So it is imperative to be able to show that young people who need services are accessing them in larger numbers.
  • What? Every site that aims to become adolescent friendly needs to track the number of young clients; at a minimum, districts need to collect data on how many young people access these sites. If possible, the data gathered by the HMIS (health management information system) should be disaggregated in a way that allows one to track young people's service use in all sites.
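The age disaggregation described above can be sketched in a few lines. This is a hypothetical illustration: the record fields (`site_id`, `age`) and the 10–19 adolescent age band are assumptions for the example, not fields of any real HMIS.

```python
# Sketch: count adolescent (10-19) visits per service delivery point
# from age-disaggregated HMIS-style visit records (fields are assumed).
from collections import Counter

def adolescent_visits_by_site(visits):
    """Count visits by clients aged 10-19 for each site."""
    counts = Counter()
    for v in visits:
        if 10 <= v["age"] <= 19:
            counts[v["site_id"]] += 1
    return dict(counts)

visits = [
    {"site_id": "SDP-01", "age": 16},
    {"site_id": "SDP-01", "age": 34},
    {"site_id": "SDP-02", "age": 12},
]
print(adolescent_visits_by_site(visits))  # {'SDP-01': 1, 'SDP-02': 1}
```

Tracking this count quarterly per site gives the utilisation trend the slide calls for.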

  6. 4. We want to be able to show that these changes can be attributed to the application of the AFHS standards
  • Why? To scale up AFHS & get increasing numbers of adolescent clients to use AFHS in the country will need actions to be carried out & paid for through the existing health system. Tough questions will therefore be asked by those who must be convinced that this is worth their effort (e.g. district medical officers & health facility managers), as well as by those who decide on allocating funds (health & financing authorities at provincial & national levels, including donors).
  • What? To demonstrate that the effects of AFHS are not due to chance, some comparative research needs to be carried out to show beyond doubt that the application of the AFHS standards produced the increased utilisation.

  7. Two types of monitoring processes
  • Facility-level monitoring to track implementation & progress made
  • Programme-level monitoring to track how implementation is being rolled out & scaled up

  8. Standards Monitoring – Option 1
  • Each standard is an indicator
  • All criteria are measured and included in the calculation of the indicator for each standard
  Example of principle: National Adolescent Friendly Clinic Initiative, South Africa: 10 indicators, composed of a total of 41 weighted criteria
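The principle of Option 1 can be sketched as a weighted score per standard. The criteria and weights below are invented for illustration; the South African initiative's actual 41 criteria and their weights are not reproduced here.

```python
# Sketch of Option 1: a standard's indicator as the weighted share
# of its criteria that are met (weights are illustrative only).
def standard_score(criteria):
    """criteria: list of (met: bool, weight: float). Returns % achieved."""
    total = sum(w for _, w in criteria)
    met = sum(w for ok, w in criteria if ok)
    return 100.0 * met / total

# e.g. one standard with four weighted criteria, three of them met
criteria = [(True, 3), (True, 2), (False, 2), (True, 1)]
print(standard_score(criteria))  # 75.0
```

Repeating this per standard yields one indicator value per standard, e.g. 10 indicator values for a 10-standard set.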

  9. Example of how standards can be tracked

  10. Standards Monitoring – Option 2
  • Limited set of tracer indicators
  • Used for quality assessment & accreditation
  Example: India, Ambala, Haryana State

  11. Programme-level Monitoring
  • Select a limited set of strategic indicators
  • They should provide guidance that "good things" are implemented in many more places ("scale and quality")
  • These indicators should be actively measured at district, provincial & national levels (as per the national situation)
  • Periodicity: quarterly

  12. Programme-level Monitoring – example indicators
  Scale of implementation – inputs
  • % of SDPs with at least one trained service provider
  • % of sites that received supportive supervision in the last 3 months
  • % of districts with an approved budget line for AFHS
  • Total district-level funds allocated to AFHS
  Quality of implementation – outcomes
  • Services delivered according to national guidelines
  • % of SDPs that have been accredited as AFHS
  Coverage & Utilisation
  • Quarterly trends in overall service utilisation
  • Quarterly trends in new client visits
  SDP = Service Delivery Point
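Two of the input indicators above can be computed from simple district records, as in this hypothetical sketch. The record fields (`trained_providers`, `supervised_last_q`) are assumptions for the example, not fields of a real reporting system.

```python
# Sketch: computing "% of SDPs with at least one trained provider" and
# "% of sites supervised in the last quarter" from district records.
def pct(numer, denom):
    return 100.0 * numer / denom if denom else 0.0

sdps = [
    {"id": "SDP-01", "trained_providers": 2, "supervised_last_q": True},
    {"id": "SDP-02", "trained_providers": 0, "supervised_last_q": True},
    {"id": "SDP-03", "trained_providers": 1, "supervised_last_q": False},
]

pct_trained = pct(sum(1 for s in sdps if s["trained_providers"] >= 1), len(sdps))
pct_supervised = pct(sum(1 for s in sdps if s["supervised_last_q"]), len(sdps))
print(f"{pct_trained:.0f}% trained, {pct_supervised:.0f}% supervised")
# 67% trained, 67% supervised
```

Recomputed each quarter per district, these percentages feed directly into the dashboard described on the next slide.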

  13. A "Dashboard" approach should be used at national & district level to track this set of key indicators, visually & continuously

  14. In addition: "Intensified Monitoring"
  Intensified monitoring to document and evaluate impact
  • Externally facilitated baseline using the quality monitoring toolkit in selected sites in 3 districts
  • Compare with control sites in the same or similar districts
  • Follow up every 3/6 months for 2 years
  • Conduct a yearly community-level "coverage measurement" to assess levels of access
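The intervention-versus-control comparison above can be summarised with a difference-in-differences style calculation, sketched below. The visit counts are invented; the slide does not prescribe this particular statistic, so it is offered only as one simple way to express "change in AFHS sites beyond the change in control sites".

```python
# Sketch: change in utilisation at AFHS sites minus the change at
# control sites over the same period (invented numbers).
def did(interv_before, interv_after, control_before, control_after):
    """Difference-in-differences: intervention change minus control change."""
    return (interv_after - interv_before) - (control_after - control_before)

# Baseline vs. 2-year follow-up adolescent visits per quarter
effect = did(120, 310, 115, 140)
print(effect)  # 165
```

A positive value suggests the increase in utilisation exceeded what the control sites experienced, which is the kind of evidence slide 6 says funders will demand.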
