Introducing the Multi-Indicator Version of the RDQA Tool

Presented at the

MEMS - MEASURE Evaluation Brown Bag, Abuja

December 7, 2012
Background - 1

  • National programs and donor-funded projects are working towards achieving ambitious goals in the fight against HIV, TB and malaria.

  • Measuring success and improving management of these initiatives depend on strong M&E systems that produce quality data regarding program implementation.

  • As a result of strategies like “Three Ones”, the “Stop TB Strategy” and the “RBM Global Strategic Plan”, a multi-partner project* was launched in mid-2006 to develop a joint Routine Data Quality Assessment (RDQA) Tool.

  • The objective of this initiative was to provide a common approach for assessing and improving data quality (between partners and with National Programs).

* Partners most directly involved include PEPFAR, USAID, WHO, Stop TB, the Global Fund and MEASURE Evaluation.

Background - 2

  • Importantly, funding is tied to performance and to demonstrating the effectiveness of interventions

  • Hence, quality data are imperative for showing program effectiveness

  • The single-indicator RDQA tool was used for joint national DQA exercises in 2008, 2009, 2011 and 2012 (led by NACA)

  • The multi-indicator tool has never been used in the country, so there is a need to sensitize M&E professionals to the potential opportunities in this tool

  • This tool provides an opportunity to evaluate data quality for selected priority indicators in different program areas at the same time and to identify areas for improvement

Countries where RDQA has been used or is currently being implemented

  • Kenya

  • Tanzania

  • South Africa, Lesotho, Swaziland

  • Nigeria

  • Cote d’Ivoire

  • DRC

  • Haiti

  • Mozambique

  • India

  • Botswana

  • Global Fund On Site Data Verification (OSDV) by LFAs in many countries

Data Quality

  • Refers to the worth/accuracy of the information collected & focuses on ensuring that the processes of capturing, verifying and analysing data are of a high standard.

  • RDQA tools facilitate this process and also provide an opportunity for capacity building

Why Data Quality is Important - I

  • Mistakes should be prevented rather than detected

  • Correcting data that have been wrongly recorded is difficult and expensive

  • The quality of the data is largely determined by how well the data are collected and forms are completed

  • In the presence of errors, data cannot be interpreted – useless!

  • Increased data quality → increased reliability and usability

Why Data Quality is Important - II

  • Program planning

  • Data use

  • Program decision making

  • Sharing program information

  • Reporting/Accountability

Data Quality Assurance

  • Data Quality Assessment involves checking data against several criteria/dimensions:

    • Validity

    • Integrity

    • Reliability

    • Timeliness

    • Completeness

    • Precision

    • Confidentiality

  • The DQA tool is used to assess the quality of the data and should be responsive to all seven dimensions

  • Assessment helps us determine areas of poor data quality and come up with action plans for potential solutions.

Objectives of RDQA

    • VERIFY the quality of reported data for key indicators at selected sites

    • ASSESS the ability of data-management systems to collect, manage and report quality data.

    • IMPLEMENT measures with appropriate action plans for strengthening the data management and reporting system and improving data quality.

    • MONITOR capacity improvements and performance of the data management and reporting system to produce quality data.

Uses of RDQA

    • Routine data quality checks as part of on-going supervision

    • Initial and follow-up assessments of data management and reporting systems – measure performance improvement over time

    • Strengthening program staff’s capacity in data management and reporting

    • External assessment by partners and other stakeholders

Conceptual Framework of DQA

    [Diagram: reporting pyramid — Service Points report to Intermediate Aggregation Levels (e.g. LGAs, States), which report to the M&E Unit]

    Generally, the quality of reported data is dependent on the underlying data management and reporting systems; stronger systems should produce better quality data.

Dimensions of Quality

    Accuracy, Completeness, Reliability, Timeliness, Confidentiality, Precision, Integrity

Functional Components of a Data Management System Needed to Ensure Data Quality

    Data-Management and Reporting System:

    • M&E Structure, Functions and Capabilities

    • Indicator Definitions and Reporting Guidelines

    • Data-collection and Reporting Forms / Tools

    • Data Management Processes

    • Data Quality Mechanisms and Controls

    • Links with the National Reporting System

RDQA Methodology: Chronology and Steps

    [Diagram: seven steps grouped into four phases]

    1. Determine scope of the DQA

    2. Determine indicators, data sources and time period

    3. Determine and notify facilities/sites

    4. Assess data management system

    5. Verify data

    6. Summarize findings and prepare action plan

    7. Implement activities and follow up

    Implementation is conducted at the M&E Unit, service sites and intermediate aggregation levels, as appropriate, given the scope of the DQA

RDQA Methodology: Protocols

    • The methodology for the DQA includes two (2) protocols:

    1. Data Verifications (Protocol 1): quantitative comparison of recounted to reported data and review of timeliness, completeness and availability of reports.

    2. Assessment of Data Management Systems (Protocol 2): qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.

RDQA Methodology: Protocols

    • Data Verification

      • Documentation Review

      • Recounted results – trace and verify

      • Cross checks – compare with alternative data sources

    • Reporting Performance

      • Timeliness, completeness, availability (Intermediate level and higher)

    • System Assessment

      • Are elements in place to ensure quality reporting?
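The reporting-performance checks above (timeliness, completeness, availability) boil down to simple percentages over the reports an aggregation level expected to receive. A minimal sketch, assuming an illustrative `Report` record and made-up site names (none of these come from the RDQA workbook itself):

```python
# Hypothetical sketch of the three reporting-performance metrics
# (availability, timeliness, completeness). Field names and the
# example sites are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Report:
    site: str
    received: bool   # report is available at the aggregation level
    on_time: bool    # received by the reporting deadline
    complete: bool   # all required data fields are filled in

def reporting_performance(reports):
    """Return (% available, % on time, % complete) over expected reports."""
    expected = len(reports)
    available = sum(r.received for r in reports)
    on_time = sum(r.received and r.on_time for r in reports)
    complete = sum(r.received and r.complete for r in reports)
    pct = lambda n: round(100.0 * n / expected, 1)
    return pct(available), pct(on_time), pct(complete)

reports = [
    Report("Site A", True, True, True),
    Report("Site B", True, False, True),
    Report("Site C", False, False, False),
    Report("Site D", True, True, False),
]
print(reporting_performance(reports))  # (75.0, 50.0, 50.0)
```

The denominator is always the number of reports *expected*, not received, which is why a missing report drags all three percentages down.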

RDQA Methodology: Data Verification Component

    [Diagram: indicator data are traced and verified from Service Delivery Sites / Organizations, through Intermediate Aggregation Levels (e.g. District, Region), to the M&E Management Unit]

    • PURPOSE: Assess on a limited scale whether Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time.

    • The data verification step takes place in two stages:

      • In-depth verifications at the Service Delivery Points; and

      • Follow-up verifications at the Intermediate Aggregation Levels (Districts, Regions) and at the M&E Unit.
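The two stages above can be sketched as a pair of checks: recount indicator values against site reports, then confirm each aggregation level's figure matches the sum of the reports beneath it. Site names and counts are illustrative assumptions, not data from any actual exercise:

```python
# Hypothetical sketch of the two-stage data verification.
# Stage 1: in-depth recounting at the service delivery sites.
# Stage 2: follow-up check that the aggregation level's reported
# total equals the sum of the site reports it received.

site_reported = {"SDS 1": 40, "SDS 2": 35, "SDS 3": 25}
site_recounted = {"SDS 1": 38, "SDS 2": 35, "SDS 3": 25}
district_reported = 100  # what the district reported upward

# Stage 1: flag sites where the recounted value differs from the report
discrepancies = {s: site_reported[s] - site_recounted[s]
                 for s in site_reported
                 if site_reported[s] != site_recounted[s]}

# Stage 2: verify the district's aggregation arithmetic
aggregation_ok = district_reported == sum(site_reported.values())

print(discrepancies)   # {'SDS 1': 2}
print(aggregation_ok)  # True
```

Note the two stages catch different problems: stage 1 finds recording errors at the site, while stage 2 finds transcription or aggregation errors higher up, even when every site report was itself accurate.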

RDQA Methodology: Data Verification

    [Diagram: trace and verify — an example indicator (number on ARVs) is recounted from source documents at each Service Delivery Site, checked against the sites' monthly reports, then against the District monthly reports, and finally against the M&E Unit/National monthly report]

RDQA Methodology: Systems Assessment Component

    [Diagram: data management and reporting systems are assessed at Service Delivery Sites / Organizations, at Intermediate Aggregation Levels (e.g. District, Region), and at the M&E Management Unit]

    • PURPOSE: Identify potential risks to data quality created by the data-management and reporting systems at:

      • the M&E Management Unit;

      • the Service Delivery Points;

      • any Intermediate Aggregation Level (District or Region).

  • The RDQA assesses both (1) the design; and (2) the implementation of the data-management and reporting systems.

  • The assessment covers 8 functional areas (HR, Training, Data Management Processes, etc.)

RDQA Outputs

    1. Strength of the M&E System: evaluation based on a review of the program/project's data management and reporting system, including responses to overall summary questions on how well the system is designed and implemented;

    2. Verification Factors: generated from the trace-and-verify recounting exercise performed on primary records and/or aggregated reports (i.e. the ratio of the recounted value of the indicator to the reported value);

    3. Available, On-time and Complete Reports: percentages calculated at the Intermediate Aggregation Level and the M&E Unit;

    4. Action Plan for System Strengthening: for each level assessed.
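Output 2, the verification factor, is just the ratio described above. A minimal sketch (the function name and the ARV example numbers are illustrative, not from the tool):

```python
# Hypothetical sketch of the verification factor: the ratio of the
# recounted indicator value (from source documents) to the value the
# site reported. Example figures are made up.

def verification_factor(recounted, reported):
    """Recounted / reported. 1.0 means an exact match; a factor
    below 1 suggests over-reporting, above 1 under-reporting."""
    if reported == 0:
        raise ValueError("reported value is zero; factor undefined")
    return recounted / reported

# A site reported 120 clients on ARVs, but only 108 could be
# recounted in the source registers -> over-reporting.
vf = verification_factor(recounted=108, reported=120)
print(round(vf, 2))  # 0.9
```

A factor meaningfully different from 1.0 at any level flags where in the reporting chain the action plan (output 4) should focus.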

ILLUSTRATION

    [Chart: example of systems findings at the M&E Unit (HIV/AIDS)]

Thank you…

    MEASURE Evaluation is a MEASURE project funded by the U.S. Agency for International Development and implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. Views expressed in this presentation do not necessarily reflect the views of USAID or the U.S. Government. MEASURE Evaluation is the USAID Global Health Bureau's primary vehicle for supporting improvements in monitoring and evaluation in population, health and nutrition worldwide.

    Visit us online at