
Data Management Metrics: The Bar to Which We Measure All of Our Studies


Presentation Transcript


  1. Data Management Metrics: The Bar to Which We Measure All of Our Studies Donna Gugger Associate Director, Clinical Data Management United BioSource Corporation 17 Blacksmith Road Newtown, PA 18940 www.unitedbiosource.com

  2. Operational and Study Quality Metrics • Metrics can be broken down into two general categories: • Operational Metrics • Monitor internal processes • Date of Receipt -> Data Entry • Data Entered -> Reviewed • Reviewed -> Clean • Page counts • Status updates • Projections • Universal across studies • Study Quality Metrics • Meeting critical study objectives • Identify potential problems and allow for corrective action • Project-specific plan

  3. Operational and Study Quality Metrics • What Type of Metric Reports are Needed? • Determined by collaboration between Clinical, Data Management and Statistics • Status updates • Identify critical study objectives • Ensure objectives are measurable and actionable • Define what data are available to produce metric reports • Ensure that interpretation is consistent

  4. Operational and Study Quality Metrics • Why Do We Utilize Metric Reporting? • Provide information regarding cost, time, quantity and resource management • Identify data quality issues • Ensure the study is progressing as intended • Provide tools for monitoring • Provide insight to ensure studies are on target in meeting critical objectives • Allow for early detection of potential problems

  5. Operational and Study Quality Metrics • Criteria for Effective Metrics • Generate metric reports that add value to study team objectives • Focus on producing a limited set of reports, each containing targeted information • Include dates of sources in report headings/footer • Determine frequency based on study team needs and data availability

  6. Operational Metrics vs Study Quality Metrics • Operational Metrics and Study Quality Metrics (SQMs) overlap: • Operational only: Monitor internal processes • Status: Site/Patient Status • Page/Query Counts • Both: Data Quality Issues • Trends • Training • Corrective Action • SQMs only: Meeting Study Objectives? • Study Progress • Project Specific

  7. Operational Metrics • Operational Metrics • Provide information regarding cost, time, quantity and resource management • Data management process metrics • Performance e.g., timeliness of data entry • Data quality • Identify sites that may require additional training regarding particular query types • Study status – overall and detailed reports • Pages received, entered, reviewed and clean • Missing page reports • Queries open, closed • Query aging reports can improve query response times • Visit projections • Assists CRAs in preparation for monitoring visits • Project visits completed based on enrollment
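The study status counts above (pages received, entered, reviewed and clean) can be sketched as a simple tally. This is an illustrative example only, not from the presentation; the site names, page IDs and status labels are hypothetical.

```python
from collections import Counter

# Hypothetical CRF page records: (site, page_id, status).
# Status names are illustrative, not from any specific CDM system.
pages = [
    ("Site 1", "1001-AE-01", "clean"),
    ("Site 1", "1001-VS-01", "entered"),
    ("Site 1", "1002-AE-01", "reviewed"),
    ("Site 2", "2001-AE-01", "received"),
]

def page_status_summary(records):
    """Tally CRF pages by site and workflow status."""
    summary = {}
    for site, _page, status in records:
        summary.setdefault(site, Counter())[status] += 1
    return summary

summary = page_status_summary(pages)
# Site 1 has one page each in clean/entered/reviewed; Site 2 has one received
```

In practice the same tally, run against expected-page lists, also drives the missing page report (expected pages with no received record).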

  8. Study Quality Metrics • Study Quality Metrics • Provide insight to ensure studies are on target in meeting critical objectives • To achieve key study endpoints • Site performance issues • Identifying errors • Determine underlying causes • Take corrective action • Provide additional site/monitoring training • Obtain buy-in from entire project team • Develop and approve SQM Plan as part of study start-up activities • Keeps focus on study quality

  9. Data Management Metrics • Utilize available data based on specific study needs • Define Metrics to be used • Consider available data sources • Clinical (CRF) database • Electronic data (e.g., IVR, laboratory, ECG, imaging, diary data) • EDC • Allows for real-time reporting provided the data are entered on an ongoing basis and in a timely manner • Limit reports to those that are meaningful and will be used • Standard Definitions • Agreed upon by study team prior to implementation • Meaningful and actionable • Determine frequency of reports • Effective reporting takes into consideration processing time, workflow and timelines • Quarterly, monthly, weekly • Automate as much as possible

  10. Operational Metrics – Query Aging Report • Tool to assist monitors with timely query resolution • By Site, Country/Region or Overall Study • Identify sites delinquent with query resolution
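A query aging report of the kind described above can be sketched by bucketing each open query's days-open count per site. This is a hypothetical sketch; the sites, query IDs, dates and bucket boundaries are all illustrative assumptions.

```python
from datetime import date

# Hypothetical open queries: (site, query_id, date_issued).
open_queries = [
    ("Site 1", "Q-001", date(2009, 6, 1)),
    ("Site 1", "Q-002", date(2009, 7, 10)),
    ("Site 2", "Q-003", date(2009, 7, 17)),
]

def query_aging(queries, as_of, buckets=(7, 14, 30)):
    """Count open queries per site in aging buckets (days open)."""
    labels = ["0-7", "8-14", "15-30", ">30"]
    report = {}
    for site, _qid, issued in queries:
        age = (as_of - issued).days
        for label, limit in zip(labels, buckets):
            if age <= limit:
                bucket = label
                break
        else:
            bucket = labels[-1]  # older than the largest bucket boundary
        report.setdefault(site, {lbl: 0 for lbl in labels})[bucket] += 1
    return report

report = query_aging(open_queries, as_of=date(2009, 7, 19))
# Site 1: one query 48 days old (>30), one 9 days old (8-14); Site 2: one 2 days old
```

Sites with queries piling up in the oldest bucket are the delinquent ones the slide refers to.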

  11. Operational Metrics – Outstanding CRF Page Report • Tool to assist monitors with data retrieval • Assurance that pages are retrieved prior to interim analysis or database lock

  12. Operational Metrics – Data Collection Report • Sample Data Collection Report • Project visits expected • Identify outstanding visits/pages • Utilize available electronic data

  13. Operational Metrics – Data Collection Report • Sample Data Collection Report

  14. Operational Metrics – Patient Summary • Track Study Progress • By site and overall

  15. Operational Metrics – Detailed Patient Summary • Date of CRF Data: 19Jul2009 • Date of IVRS data: 12Jul2009

  16. Operational Metrics – Outstanding Query Reports • Detailed Outstanding Query Report • Tool for monitors to help sites complete and return queries in a timely manner • Monitors can filter the spreadsheet for their own sites to see how many queries are outstanding, how long they have been open and the actual query text • Query Trend Analysis • Identify common data issues across sites or for specific sites • Unique edit check identifier links common queries
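The trend analysis above, where a unique edit check identifier links common queries, can be sketched as a count over those identifiers. The check IDs, sites and threshold below are hypothetical.

```python
from collections import Counter

# Hypothetical queries tagged with the unique edit check that fired:
# (site, edit_check_id). IDs are illustrative.
queries = [
    ("Site 1", "EC_VS_001"),
    ("Site 1", "EC_VS_001"),
    ("Site 2", "EC_VS_001"),
    ("Site 2", "EC_AE_007"),
]

def query_trends(records, min_count=3):
    """Flag edit checks that fire frequently across the study."""
    overall = Counter(check for _site, check in records)
    return [check for check, n in overall.most_common() if n >= min_count]

common_issues = query_trends(queries)
# EC_VS_001 fired 3 times across sites -> flagged as a common data issue
```

Grouping the same counts by site instead would surface sites that may need additional training on a particular query type.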

  17. Operational Metrics – Outstanding Query Reports • Detailed Outstanding Query Report, cont.

  18. Study Quality Metrics • Focus on study data quality • Identify and agree upon "Key Success Factors" that will optimize the quality of the trial data • Establish target goals to optimize overall study quality • Joint effort between Data Management, Statistics, Clinical and Sponsor • SQM Plan developed based on critical success criteria, which may include: • Inclusion/exclusion • Study schedule • Key safety and efficacy endpoints • Transparency and ongoing demonstration of study quality foster trust and alignment • Decisions made at study start-up • Discourages "blame sessions" when issues arise

  19. Study Quality Metrics • Reports can be generated as a cumulative display, overall or by site, taking into consideration whether the study is blinded or unblinded • Site displays can be blinded so as not to identify the specific investigator name or location • Allows for strategic, focused management of only those sites with potential issues • Safety and efficacy reports • Safety – can be used across studies, e.g., Adverse Event reporting • Efficacy – focus on study-specific endpoints, e.g., Progression-Free Survival (PFS) as primary endpoint

  20. Study Quality Metrics Strategy • Develop meaningful metrics • Data that are measurable and actionable • Determine goal for each metric • Acceptable variability • Alert levels – when to take action • Data need to be available and current to make the metrics useful • Operational reports may provide supportive information to determine if SQMs are needed • Missing pages • EDC – sites not entering data in a timely fashion • Determine the underlying problem • Lack of understanding of protocol specifics • Compliance with PK data collection – site not collecting blood samples at required time points • Lack of site/monitoring training

  21. Study Quality Metrics Strategy • Implement corrective action • Described and approved in the SQM Plan • Head off potential problems by minimizing impact on study • Examples: • Patients enrolled do not meet intended study population criteria • Laboratory results not recorded as Adverse Events • Toxicity findings • Subjects not hydrated properly at certain sites, presenting a high incidence of renal toxicity events • Reporting use of disallowed concomitant therapies • Misinterpretation of CRF completion guidelines/protocol • Corrective action: review site performance for continued participation

  22. Study Quality Metrics – Scenario 1 – AE Reporting • Adverse Event Reporting • Unequal distribution of reported adverse events across study sites • Can be over- or under-reported on a per-site basis, relative to overall study performance • Impact on study • Inaccurate safety profile • Data quality

  23. Are the Sites Reporting Safety Events Appropriately? Scenario 1

  24. Study Quality Metrics – Scenario 1 – AE Reporting • Identify problem • 6 AEs per patient are being reported across all sites • Under-reporting of AEs at Site 4? • Over-reporting of AEs at Site 5? • Determine root cause • e.g., Are sites using the same script for eliciting AEs? • Open-ended questions ("Any health problems in the past week?") • Targeted questions ("Did you feel fatigue or nausea in the past week?") • Corrective action – training opportunity • Re-visit CRF completion guidelines • Review AE reporting requirements
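The Scenario 1 check, comparing each site's AE-per-patient rate against the overall study rate, can be sketched as follows. The counts echo the slide's pattern (roughly 6 AEs per patient overall, with Site 4 under-reporting and Site 5 over-reporting), but the exact numbers and the 50% relative-deviation alert level are illustrative assumptions.

```python
def ae_rates(ae_counts, patient_counts):
    """AEs per patient by site, plus the overall study rate."""
    per_site = {s: ae_counts[s] / patient_counts[s] for s in ae_counts}
    overall = sum(ae_counts.values()) / sum(patient_counts.values())
    return per_site, overall

def flag_sites(per_site, overall, tolerance=0.5):
    """Flag sites whose rate deviates from the overall rate by more than
    `tolerance` (relative). The 0.5 threshold is a hypothetical alert level."""
    return sorted(s for s, rate in per_site.items()
                  if abs(rate - overall) / overall > tolerance)

# Illustrative counts: ~6 AEs/patient at most sites,
# under-reporting at Site 4, over-reporting at Site 5.
ae = {"Site 1": 60, "Site 2": 58, "Site 3": 62, "Site 4": 20, "Site 5": 110}
pts = {"Site 1": 10, "Site 2": 10, "Site 3": 10, "Site 4": 10, "Site 5": 10}
per_site, overall = ae_rates(ae, pts)
flagged = flag_sites(per_site, overall)
# Site 4 (2.0 AEs/patient) and Site 5 (11.0 AEs/patient) are flagged
```

Flagged sites then get the root-cause review and training described above.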

  25. Study Quality Metrics – Scenario 2 – Primary Endpoint Assessment • Accuracy of assessment of primary endpoint • Applicable to time-to-event endpoints such as Progression-Free Survival (PFS) • Assessments of disease progression do not occur as scheduled • Assessments required every 2 treatment cycles • Impact on study • Artificially increased PFS estimates • Could result in a biased treatment effect on PFS

  26. Are We Accurately Assessing the Primary Endpoint?

  27. Study Quality Metrics – Scenario 2 – Primary Endpoint Assessment • Identify problem • At most sites, at least 90% of patients have primary endpoint assessments performed within a week of their scheduled visits • Only 50% of patients at Site 1 have primary assessments performed within a week of their scheduled visits • Determine root cause • More patients unable to tolerate treatment regimen? • Operational problems in obtaining assessments on schedule? • Patients reside far from the study site • Site has other competing studies running concurrently • Corrective action – training opportunity • Additional site visits may be needed • Review existing procedures for primary endpoint assessment • Identify potential tolerability issues not previously reported • Provide patient transportation assistance
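The Scenario 2 metric, the fraction of assessments performed within a week of schedule per site, can be sketched like this. The dates and site names are hypothetical; the 7-day window and 90% target come from the scenario.

```python
from datetime import date

# Hypothetical assessment records: (site, scheduled_date, performed_date).
assessments = [
    ("Site 1", date(2009, 6, 1), date(2009, 6, 20)),   # 19 days late
    ("Site 1", date(2009, 6, 15), date(2009, 6, 16)),  # on time
    ("Site 2", date(2009, 6, 1), date(2009, 6, 3)),    # on time
    ("Site 2", date(2009, 6, 15), date(2009, 6, 18)),  # on time
]

def on_time_rate(records, window_days=7):
    """Fraction of assessments performed within window_days of schedule, by site."""
    totals, on_time = {}, {}
    for site, scheduled, performed in records:
        totals[site] = totals.get(site, 0) + 1
        if abs((performed - scheduled).days) <= window_days:
            on_time[site] = on_time.get(site, 0) + 1
    return {s: on_time.get(s, 0) / totals[s] for s in totals}

rates = on_time_rate(assessments)
below_target = [s for s, r in sorted(rates.items()) if r < 0.9]  # 90% alert level
# Site 1 falls below the 90% target; Site 2 meets it
```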

  28. Study Quality Metrics – Scenario 3 - Patient Population • Protocol-defined Patient Population • Are we enrolling the right patients into the study? • Key eligibility criteria identified ahead of time and agreed to by all stakeholders, i.e.: • Baseline ECOG score of 0 or 1 • Remission following 1st line chemotherapy for >6 but <18 months • Impact on Study • Enrolled patients do not reflect the study population as specified in the protocol • Treatment effect may be diluted in the overall population of interest

  29. Are We Enrolling the Right Patients?

  30. Study Quality Metrics – Scenario 3 – Patient Population • Identify problem • At most sites, at least 90% of patients are meeting key eligibility criteria for enrollment • Only 50% of patients at Site 3 are meeting key eligibility criteria for enrollment • Determine root cause • Possible misinterpretation of inclusion/exclusion criteria? • Pressure to meet pre-study targets for enrollment? • Corrective action – training opportunity • Review specific inclusion/exclusion criteria • Reiterate the importance of enrolling patients representing the population of interest
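The Scenario 3 metric, the percentage of enrolled patients meeting the pre-agreed key eligibility criteria per site, reduces to a per-site proportion. The enrollment records below are hypothetical; in a real report the boolean would be derived from the protocol's key criteria (e.g., baseline ECOG score, remission duration).

```python
def eligibility_rate(patients):
    """Percent of enrolled patients meeting all key eligibility criteria, by site."""
    totals, met = {}, {}
    for site, meets_criteria in patients:
        totals[site] = totals.get(site, 0) + 1
        if meets_criteria:
            met[site] = met.get(site, 0) + 1
    return {s: 100 * met.get(s, 0) / totals[s] for s in totals}

# Illustrative enrollment records: (site, meets all key criteria?)
enrolled = [("Site 1", True), ("Site 1", True),
            ("Site 3", True), ("Site 3", False)]
rates = eligibility_rate(enrolled)
# Site 1: 100%; Site 3: 50% -> review inclusion/exclusion training at Site 3
```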

  31. Metric Reporting – Additional Considerations • Implementation responsibility • Collaboration of Statistics, Programming and Data Management • Determine the level of validation needed for metric reports • Metric report distribution • Distribution list • Determine delivery method, e.g., study portal, e-mail, J-Review via web • Frequency of distribution will be defined according to individual metric report and study needs

  32. Data Management Metrics: The Bar to Which We Measure All of Our Studies – Conclusion • Metrics provide useful tools to optimize data quality and processes and to ensure that critical study endpoints are met • Metric standards must be defined at study start-up and agreed upon by the study team prior to implementation • Meaningful and useful • Implementation and frequency of reporting should allow enough time for corrective action to be taken
