

  1. A Framework for Leveraging Health Information Technology (HIT) in Clinical Trial Planning and Execution • Brendan O’Neill, Director, Clinical Research • Otis Johnson, Associate Director, Clinical Research • 15-Nov-2012

  2. Outline • Learning objectives • Business problem • Response framework • HIT and EHR as part of the solution • Expectations and success indicators • Lessons learned from early EHR experience • Performance measurement

  3. Learning Objectives • Learn how to: • Incorporate HIT in clinical trial planning and execution • Partner with IT to advance trial operations • Include EHR evaluation in the standard study feasibility assessment process • Avoid common pitfalls when using EHR information • Measure performance against pre-established goals

  4. Patient Enrollment Period Extended • Chart: percent of sites extending the patient enrollment period • On average, 66% of sites extend the enrollment timeline 20–80% of the time • Source: CenterWatch Investigative Global Site Survey 2011

  5. Why Do We Care? • Problem: clinical trial recruitment is often slow and unpredictable • Result: unmet expectations, delayed filings, lost revenue, damaged relationships • Value of 1 day in drug development: $37,000 in operational cost; $1.1M in prescription revenue • Source: Gartner. 2007. Case study: Boosting the predictability of clinical trial performance.

  6. Traditional Response to Slow Enrollment • Typical responses to under-enrollment: add countries, add sites, revise the protocol, implement costly remediation programs (recruitment campaigns involving vendors; rejuvenation workshops) • These responses are disruptive, costly, and not sustainable • Did we set the right expectations? Can we accurately predict trial performance? Can we do a better job setting expectations?

  7. Opportunity for Improved Response • Root cause analysis: insights into trial performance; factors that impact trial performance; an organizational culture that supports innovation and sound change • Key finding: trials are often designed without consideration of real-world clinical data • Can we leverage EHR in the planning and execution of clinical trials?

  8. Response Framework • Standard feasibility assessment process • Incorporation of EHR analyses in study planning and execution • Allocation of study optimization resources • Enrollment modeling technology

  9. FTEs Dedicated to Addressing the Problem • GTO (Global Trial Optimization): optimize clinical trial feasibility and execution through data-driven analyses • Scope and objectives: operational insight into trial design; geographic input; enrollment modeling; R&R strategies and tactics • Inputs: data resources, analytical tools, technical expertise, data analyses

  10. Understanding Trial Performance: Factors Impacting Trial Enrollment • Screening rate • Screen failure • Site failure • Site initiation period • Site enrollment capacity

  11. Extensive Research- Data Sources and Types • Public • Clinicaltrials.gov • Literature reports • Proprietary • CTMS • IRT/IVRS/IWRS • Purchased • Industry consortia: CMR, KMR, DecisionView • Aggregators: TrialTrove, SiteTrove, BioPharm Clinical, Clinical Trials Insight, etc.

  12. How We Gain Insights From Internal Data

  13. Early Assumption-Based Enrollment Model • Chart: number of randomized patients vs. months after protocol approval
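An assumption-based model like the one charted on this slide can be sketched as a simple projection of cumulative randomizations from site activation and per-site enrollment rate. The function and all parameter values below are illustrative assumptions, not figures from the presentation.

```python
def projected_randomized(months, total_sites, ramp_months, rate_per_site_month):
    """Cumulative randomized patients by month after protocol approval.

    Assumes sites activate linearly over ramp_months, then each active
    site enrolls at a constant rate_per_site_month (illustrative only).
    """
    cumulative, total = [], 0.0
    for m in range(1, months + 1):
        active_sites = total_sites * min(m / ramp_months, 1.0)
        total += active_sites * rate_per_site_month
        cumulative.append(round(total, 1))
    return cumulative

# Hypothetical plan: 50 sites activating over 5 months, 0.6 patients/site/month
curve = projected_randomized(months=6, total_sites=50, ramp_months=5,
                             rate_per_site_month=0.6)
```

Plotting such a curve against months after protocol approval reproduces the slide's S-shaped early-enrollment projection.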

  14. In-life Enrollment Modeling • Charts: planned vs. actual & projected site readiness; planned vs. actual & projected screening and randomization
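In-life re-projection of the kind charted here can be sketched by carrying a trailing observed enrollment rate forward from the actuals. This is a generic illustration with invented numbers, not the presenters' actual model.

```python
def reproject(actual_monthly, remaining_months, trailing=3):
    """Extend observed monthly randomizations with a flat projection.

    The projected monthly rate is the mean of the last `trailing`
    observed months (a generic sketch, not the actual GTO model).
    """
    window = actual_monthly[-trailing:]
    rate = sum(window) / len(window)
    return list(actual_monthly) + [rate] * remaining_months

# Hypothetical actuals for months 1-5, projected 3 months ahead
series = reproject([4, 9, 15, 18, 21], remaining_months=3)
```

Comparing such a combined actual-plus-projected series against the original planned curve is what flags an enrollment timeline at risk.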

  15. Understanding Trial Performance: Factors Impacting Trial Enrollment • Screening rate • Screen failure • Site failure • Site initiation period • Site enrollment capacity

  16. Leveraging EMR for Secondary Use • Determine the value of EHR-enabled providers in facilitating improved clinical trial execution • Analysis of EHR to determine patient availability • Identification of provider-affiliated investigators • Randomization of patients
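A patient-availability analysis of this kind amounts to counting EHR records that satisfy a trial's inclusion and exclusion criteria. The sketch below uses invented field names and thresholds; a real analysis would query the provider's EHR system under appropriate data-governance controls.

```python
# Hypothetical EHR extract; fields and values are invented for illustration.
patients = [
    {"age": 58, "a1c": 8.1, "on_insulin": False},
    {"age": 72, "a1c": 7.2, "on_insulin": True},
    {"age": 45, "a1c": 9.0, "on_insulin": False},
    {"age": 67, "a1c": 6.4, "on_insulin": False},
]

def matches(p):
    """True if the record meets the (hypothetical) trial criteria."""
    inclusion = 18 <= p["age"] <= 70 and p["a1c"] >= 7.0  # inclusion criteria
    exclusion = p["on_insulin"]                           # exclusion criterion
    return inclusion and not exclusion

available = sum(matches(p) for p in patients)
```

The resulting count estimates patient availability at EHR-enabled providers; as slide 27 notes, it covers only criteria typically captured in medical records.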

  17. Data Types Desired for Clinical Research • Disease Etiology • Patient Histories: family, illness, social/psychological, allergies, medication, economic, exposure, smoking/alcohol, diet, etc. • HPI/DD*: symptoms, onset, chronology, location, radiation, severity, duration, context, modifying factors (+/-), etc. • Physical Diagnosis: physical exam, ICD primary diagnoses + comorbidity, pharmacy, labs, imaging remarks, etc. • Therapy: CPT(s), assessments, allergy, adverse events, orders, lab results, Rx dispensed, radiology, supplies, etc. • Outcomes: patient/disease progress, side effects, adverse events, outcomes, compliance, adherence, quality of life, etc. • *HPI/DD = History of Present Illness / Differential Diagnosis

  18. When is EMR Analysis Needed? • Flow diagram: Protocol Concept Available → Operational Feasibility Input into Protocol Review → Feasibility Report to Team → Addendum, as needed • Initiate vendors as needed

  19. Does EHR Have Data Needed?

  20. What Insights Can You Obtain from EHR Analysis?

  21. Potential Patients Identified in Last Year

  22. Reached 40 of 300 Potential Patients

  23. Success Measures • EHR-enabled sites vs. traditional sites: enrollment rate, enrollment duration, data quality, site startup dynamics, percent screen failure, retention, etc. • Comparison to industry benchmarks • Ability to flag enrollment challenges • Expectations: protocol refinement, investigator identification, patient recruitment

  24. EHR Criteria Assessed: Protocol 1 • Risk: inaccurate conclusions • EHR does not always contain all the information needed for clinical trial decision making

  25. EHR vs. Actual Screen Failure Reasons • Some parameters change over time

  26. Actual Screen Failures as a Proportion of all Patients Screened • May find what’s needed in certain disease areas • Know what you need

  27. Lessons Learned • EHR analysis provides the number of patients in an EHR database who match specific inclusion and exclusion criteria • Limitations: can assess only criteria typically captured in medical records; does not predict the screen failure ratio for a trial; the relationship of the EHR database population to prevalence in the general population is uncertain; the applicability of data on a subset of the US population to the rest of the world is uncertain • The study initiation process must be optimized in parallel: accelerate budget and contract negotiations; prenegotiate contract language and budget elements

  28. Results

  29. Enrollment Period Prediction: GTO-Supported Trials (Enrollment Period) • Correlation coefficient = 0.8
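A correlation coefficient like the 0.8 reported here is the Pearson correlation between predicted and actual enrollment periods across trials. The helper and the numbers below are invented for illustration, not the study's data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative predicted vs. actual enrollment periods (months); not study data.
predicted = [6, 9, 12, 15, 18]
actual = [7, 8, 13, 14, 19]
r = pearson(predicted, actual)
```

A value near 1.0 indicates that the pre-study predictions track actual enrollment durations closely across the portfolio.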

  30. Enrollment Period Prediction • Difference of 1 month

  31. Randomization Rate: Trials Achieving LPE Since 2009 • Correlation coefficient > 0.9

  32. Screen Failure – Prior Performance • Study cancelled during the enrollment period; restrictive eligibility criteria listed as a challenge • Protocol clarification letter needed • Multiple amendments were made to facilitate better enrollment; for example, several inclusion/exclusion criteria were removed • 50% of studies within variance

  33. Percent Screen Failure for Completed Trials • 80% of studies within 10% of prediction (N = 25 trials)
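A "percent within prediction" metric like this one can be computed as the share of trials whose actual screen-failure rate falls within a relative tolerance of the predicted rate. The data below are invented, chosen only to illustrate how an 80% share would arise; they are not the 25 trials from the slide.

```python
def fraction_within(predicted, actual, tol=0.10):
    """Fraction of trials whose actual value is within a relative
    tolerance `tol` of the predicted value (illustrative sketch)."""
    hits = sum(abs(a - p) / p <= tol for p, a in zip(predicted, actual))
    return hits / len(predicted)

# Hypothetical percent-screen-failure predictions vs. observed values
predicted = [40, 50, 60, 70, 80]
actual = [42, 46, 59, 80, 81]
share = fraction_within(predicted, actual)
```

Tracking this share over time is one way to test whether feasibility predictions are improving against pre-established goals.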
