
Postgraduate Course in Pharmaceutical Medicine Module 3: Clinical Development

The Role of the Statistician and the Data Manager in Clinical Trials. Dr Richard Kay, Consultant, RK Statistics; Dr Andy Richardson, Consultant, Zenetar. Tuesday 7th Feb 2012.


Presentation Transcript


  1. The Role of the Statistician and the Data Manager in Clinical Trials Dr Richard Kay, Consultant, RK Statistics Dr Andy Richardson, Consultant, Zenetar Tuesday 7th Feb 2012 Postgraduate Course in Pharmaceutical Medicine Module 3: Clinical Development

  2. Statisticians’ Role • Contribute expert input to the study design • Type of study (looking for differences or to show similarity?) • Sample size calculation • Dealing with complexity (multiplicity, interim analysis, missing data …) • Develop and undertake the statistical analysis of the data • Protocol statistical methods section • Statistical Analysis Plan • Statistical Report.... • Other • Support IDMCs • Papers/presentations.... • Regulatory questions often relate to statistical issues

  3. Planning and Setup • Statistical Issues in Design • Superiority, equivalence or non-inferiority • How many patients • Method of randomisation • Statistical methods for analysis • Dealing with multiplicity • Interim analysis

  4. Superiority, Equivalence or Non-Inferiority • Show that Drug A is better than Drug B or that Drug A works (compare to placebo) - Superiority • Demonstrate that Drug A and Drug B are ‘equally effective’ - Equivalence • Demonstrate that Drug A is ‘at least as good as’ Drug B - Non-Inferiority

  5. How many patients - Superiority • Level of significance (usually 5%) • Power required (ICH E9 recommends at least 80%) • Level of effect we are looking to detect (clinically relevant difference) • Patient-to-patient variation (for continuous endpoints) … then out pops N!
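The ingredients above combine in the standard two-sample formula n = 2(z₁₋α/₂ + z₁₋β)²σ²/δ² per group. A minimal Python sketch (illustrative only; a real trial would use validated sample-size software):

```python
from math import ceil
from statistics import NormalDist

def n_per_group(alpha, power, delta, sigma):
    """Approximate patients per group for a two-sided superiority
    comparison of two means (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # level of significance
    z_b = NormalDist().inv_cdf(power)           # power required
    return ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

# 5% two-sided significance, 90% power, clinically relevant
# difference of 5 points, patient-to-patient SD of 10
print(n_per_group(0.05, 0.90, delta=5, sigma=10))  # 85 per group
```

Note that halving the clinically relevant difference quadruples N, which is why δ is the most sensitive input.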

  6. Method of Randomisation • Simple randomisation • Block randomisation – block size • Stratified randomisation • how many factors • which factors and at what levels
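Block randomisation, for example, can be sketched as follows (a toy illustration; production systems generate and conceal the allocation list under strict procedures):

```python
import random

def block_randomise(n_patients, treatments=("A", "B"), block_size=4):
    """Permuted-block randomisation: each block contains equal
    numbers of each treatment, so group sizes stay balanced
    throughout recruitment."""
    per_block = block_size // len(treatments)
    schedule = []
    while len(schedule) < n_patients:
        block = list(treatments) * per_block  # e.g. A, A, B, B
        random.shuffle(block)                 # random order within block
        schedule.extend(block)
    return schedule[:n_patients]

random.seed(1)  # fixed seed for a reproducible illustration
print(block_randomise(8))
```

Small block sizes keep the groups balanced but make the next allocation easier to guess, which is why the block size itself is usually kept confidential.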

  7. Statistical Methods for Analysis • Endpoint type • continuous • binary • ordinal • time to event • Adjustment for baseline risk factors (covariates) – analysis of covariance • Superiority, Equivalence or Non-inferiority (methods of analysis very different) • Sensitivity analyses

  8. Dealing with Multiplicity • Single primary endpoint or several • Multiple dose groups • Implications for multiple testing and adjustment of α • Avoiding adjustment • Composite endpoints • Hierarchical testing • Strategy for secondary endpoints
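The simplest adjustment of α is Bonferroni: test each of k hypotheses at α/k. A sketch:

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni adjustment: declare significance only when
    p <= alpha / k, controlling the family-wise error rate."""
    k = len(p_values)
    return [p <= alpha / k for p in p_values]

# three comparisons, each tested at 0.05 / 3 ~= 0.0167
print(bonferroni([0.010, 0.030, 0.200]))  # [True, False, False]
```

Hierarchical testing avoids this penalty by testing endpoints in a pre-specified order and stopping at the first non-significant result.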

  9. Interim Analysis • Offers opportunity to stop early for several reasons • Overwhelming efficacy • Futility • Pre-planning required + careful management of unblinding • Impacts on α

  10. Statistical Methods Section of the Protocol • What should it contain? • Justification of sample size • Primary and secondary endpoints – clear delineation • Analysis datasets (intention to treat / per protocol) • Handling missing data • Methods of statistical analysis of the primary endpoint in detail (including adjustments for covariates); broad outline of methods for secondary endpoints • Dealing with multiplicity • Presentation of safety data • Methods of interim analysis

  11. Role of the Data • The data is the critical asset • Critical asset – for the study, for the company, for the regulator • Accuracy and Confidence • Must meet study objectives • Must comply with regulatory requirements • Observations must be accurate and confirmed • Data is required to support • Observations/measurements of the trial • Conduct of the trial

  12. Role of the Data • Observations/measurements of the trial • Planned: e.g. research/study results • Not planned: e.g. safety data (events) • Conduct of the trial • Data to confirm that the observational data has been collected and processed consistently • Data to confirm who operated on the data, and when • Requirement to ‘recreate’ the study

  13. FDA ‘ALCOA’ Principles • Attributable • Data records must indicate who recorded the information • Data must remain under investigator control • Legible • Must include subject data, metadata and audit trails • Available in human-readable form • Contemporaneous • Recording as close to observation/measurement as reasonably possible • Audit trails to provide evidence of timing • Original • Original data or accurate transcription of original • Accurate • Data must remain unaltered when saved to the database

  14. Data Managers’ Role • Three Data Management ‘C’s • Collect the data • Collate the data • Confirm the data • Role of CDM • …to organise and ensure the collection of accurate data from the trial, to capture the data on a database, to validate and correct the data, and to provide ‘clean’ data to the statistician in a form that will facilitate the statistical analysis • Principles of Clinical Research

  15. Phases of the Clinical Trial • Infrastructure & Regulatory Framework • Project Management • Planning & Setup • Study Close & Archive • Study Execution, Data Collection & Review • Operational Implementation & Configuration • Human Resources & Training

  16. Planning and Setup [clinical trial phases diagram]

  17. CDM Role and Responsibilities • Key Tasks • Design & build study specific database/data collection tools and supporting technologies • Review and resolve inconsistencies in the data • Confirm data related compliance requirements • Manage and maintain the CDM technical infrastructure

  18. Data Management • Design & build a study specific database/data collection tools and supporting technologies • Specify and develop the required study data review methods and resolution methods • Protocol (Study Flowchart) • CRF • Database

  19. Operational Implementation [clinical trial phases diagram]

  20. Operational Implementation • Study database and process validation • Test data entry, checks, exports etc. • Confirm processes (SOPs, alerts etc) • Document (Study Validation Plan) • Load/enter study static data • Site information • Dictionary/extended codelists • Randomisation list • Authorise Users • Study/Project team • Investigators

  21. Study Execution [clinical trial phases diagram]

  22. Data Entry Process • Receipt and management of study documents • CRFs, DCFs, etc. • Populating the clinical trial database • Entry by forms, loading data • Identification of some types of inconsistencies • Correction of some types of errors • Confirmation of accuracy of the transcription • Double data entry • Entry and blind verification • Entry and interactive verification • Single entry with review • Single entry without review
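Blind double-entry verification amounts to comparing two independent transcriptions field by field. A minimal sketch (the field names are illustrative):

```python
def verify_entries(first_pass, second_pass):
    """Compare two independent entries of the same CRF page and
    flag every field where the transcriptions disagree."""
    discrepancies = []
    for field, value in first_pass.items():
        if second_pass.get(field) != value:
            discrepancies.append((field, value, second_pass.get(field)))
    return discrepancies

entry1 = {"subject": "001", "sbp": 120, "dbp": 80}
entry2 = {"subject": "001", "sbp": 120, "dbp": 88}
print(verify_entries(entry1, entry2))  # [('dbp', 80, 88)]
```

Each flagged field is then checked against the paper CRF to decide which transcription is correct.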

  23. CDM Basic Process

  24. Data Review Process • Reviewing / checking the data for consistency and accuracy • Checks for • Variables of the correct type • Variables contain valid values • Data are reasonable (within ranges) • Eliminate duplicate values • Confirmation or resolution of missing data • Uniqueness of key variables • Chronology is consistent • Implied or explicit logic is consistent • Implied or explicit existence of related data is confirmed
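Several of the checks above can be expressed as simple edit-check rules. A sketch (field names and ranges are illustrative, not from any real study):

```python
from datetime import date

def review_record(rec):
    """Run illustrative edit checks: correct type, valid range,
    and consistent chronology."""
    issues = []
    age = rec.get("age")
    if not isinstance(age, int):
        issues.append("age: wrong type or missing")
    elif not 18 <= age <= 100:
        issues.append("age: out of range")
    visit, consent = rec.get("visit_date"), rec.get("consent_date")
    if visit and consent and visit < consent:
        issues.append("chronology: visit before consent")
    return issues

rec = {"age": 150,
       "consent_date": date(2012, 1, 10),
       "visit_date": date(2012, 1, 5)}
print(review_record(rec))  # flags the range and chronology violations
```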

  25. CDM Basic Process

  26. Data Review Process • Inconsistencies are documented, resolved or confirmed • Data Clarification Forms • Document the inconsistency • Request confirmation or resolution • May suggest resolution • Investigator receives and responds to DCF • CRF plus associated DCFs represent the final state of the study data in the database.

  27. Data Coding • Categorising data for • Use in subsequent analysis (Yes=1, No=0) • Consistency with specialist lists (e.g. microbiology) • Resolving variably reported original data into a standard form • Coding using Expert terms • Dictionaries for medical history, adverse events, medications • MedDRA, WHO Drug Dictionary, ICD-9/10, etc.
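Resolving variably reported verbatim terms into a standard form is essentially dictionary lookup. A toy sketch (a real study would code against MedDRA or WHO Drug, not this hand-made dictionary):

```python
# Hand-made synonym dictionary, for illustration only
coding_dict = {
    "headache": "Headache",
    "head ache": "Headache",
    "nausea": "Nausea",
    "feeling sick": "Nausea",
}

def code_term(verbatim):
    """Map a verbatim term to its coded form; unmatched terms
    are flagged for manual coding."""
    return coding_dict.get(verbatim.strip().lower(), "UNCODED")

print(code_term("Head Ache"))  # prints Headache
print(code_term("dizzy"))      # prints UNCODED -> manual review
```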

  28. Adverse Event Reconciliation • Study CRF/database will contain • (n) adverse events, of which (m, m&lt;n) will have been reported under expedited reporting rules (i.e. Serious Adverse Events) • Serious Adverse Events are reported immediately, managed by a pharmacovigilance group, and stored both in specific pharmacovigilance systems and in the study database • Study AEs are documented on the CRF, stored in the study database, and only become available to the study team some time after the event

  29. Adverse Event Reconciliation • Study AE database and the Serious AE database must be consistent • Number of SAEs, • Subject • Reported Event • Duration (Start Date, End Date) • Outcome etc. • Relationship to study drug • Typical Method • Compare all data in each database for each SAE, resolve all inconsistencies
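The typical method above, comparing every SAE record across the two databases, can be sketched as (keys and field names are illustrative):

```python
def reconcile(study_db, pv_db):
    """Compare SAE records held in the study database and the
    pharmacovigilance database, keyed by (subject, event)."""
    mismatches = {}
    for key in set(study_db) | set(pv_db):
        study_rec, pv_rec = study_db.get(key), pv_db.get(key)
        if study_rec != pv_rec:  # missing or inconsistent record
            mismatches[key] = (study_rec, pv_rec)
    return mismatches

study_db = {("001", "headache"): {"start": "2012-01-05", "outcome": "resolved"}}
pv_db    = {("001", "headache"): {"start": "2012-01-06", "outcome": "resolved"}}
print(reconcile(study_db, pv_db))  # flags the start-date mismatch
```

Every mismatch is then resolved with the site or the pharmacovigilance group before database lock.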

  30. Close, Archive and Reporting [clinical trial phases diagram]

  31. Database Close & Locking Process • Closing a study database • Confirmation all data originally collected is present • Confirmation that the ‘data quality’ is acceptable (error rate) • Confirmation that all data has been subject to the documented procedures • Report on the omissions, inconsistencies, issues discovered or remaining in the data • Removing access to prevent further data modification • Unlocking a study database • Formal process to enable post-lock (blind broken) data modifications to occur

  32. Regulatory Framework & Infrastructure [clinical trial phases diagram]

  33. Regulatory Framework & Infrastructure • DATA RELATED • ICH Good Clinical Practice • E6: Good Clinical Practice • E9: Statistical Principles • TECHNOLOGY • Computer Systems Validation • Electronic Signatures • CFR Title 21 & Guidance for Industry • PRIVACY • Patient Identifiable Data • Data Protection Regulations

  34. Computer Systems Validation • Definition • "Confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through the software can be consistently fulfilled" • FDA General Principles of Software Validation: Final Guidance for Industry and FDA Staff • Principles • Specify intended use and user requirements; ensure and verify that the software meets those requirements through proper design, implementation and testing; and maintain proper use through an ongoing performance programme

  35. Computer Systems Validation • Installation Qualification • ‘Is the software application properly installed, are all its physical and logical requirements met, and is its platform adequately configured for user access in the work process’ • Operational Qualification • ‘Does the software application work as intended just above, just below, and at the operational limits set in its design specification’ • Performance Qualification • ‘Does the software application system perform as intended to meet user requirements specifications in a simulated work process environment’

  36. Andy Richardson (andy.richardson@zenetar.com) • Practical Guide to Clinical Data Management by Susanne Prokscha, http://www.crcpress.com

  37. Statistical Review & Reporting [clinical trial phases diagram]

  38. Statistical Analysis Plan • Expansion of the statistical methods section of the protocol • Precise detail on the analysis and presentation of the data • Table shells

  39. Analysing the Data and Reporting • Pre-programming and dry-runs • Blind Review (before breaking the randomisation code) • Choice of analysis sets, amount of missing data, how to deal with small centres, …

  40. IDMCs • Increasingly used for monitoring trials in an ongoing way, primarily in long-term mortality trials • Enables the sponsor to remain blind • Look at safety in an ongoing way • Look at unblinded interim analyses • Make recommendations on the basis of these ongoing data/results • Board consists of ≥ 3 members, one of whom must be a statistician

  41. Analysing the Data and Reporting • Clean database / database lock • Pre-analysis data checks • Subject Numbers • Missing data • Attach randomisation code • Complete analysis and outputs (analyses, tables, figures and listings) • Work with Medical and Medical Writing on Integrated Report

  42. Meta Analysis • Formal ways of bringing together the results of several trials • Sometimes used within a regulatory submission • Basis for choice of margin (delta) in non-inferiority trials • Marketing context
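The simplest formal pooling is fixed-effect inverse-variance weighting: each trial's effect estimate is weighted by 1/SE². A sketch with made-up trial results:

```python
def inverse_variance_pool(estimates, std_errors):
    """Fixed-effect meta-analysis: the pooled estimate is the
    inverse-variance weighted mean of the trial estimates."""
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return pooled, pooled_se

# three trials' treatment differences and standard errors (made up)
effect, se = inverse_variance_pool([2.1, 1.5, 2.8], [0.5, 0.8, 0.6])
print(round(effect, 2), round(se, 2))  # 2.22 0.35
```

The pooled standard error is smaller than any single trial's, which is precisely the appeal of meta-analysis in a regulatory submission.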

  43. Presenting Results • Publications: still many mistakes and bad practice • CONSORT statement (see e.g. Begg, Cho et al. (1996) JAMA) • Conference presentations and posters: frequent horror stories! Don’t get bad press because of this • Involve a statistician

  44. Regulatory Submissions - Guidelines • ICH, E9: ‘Statistical Principles for Clinical Trials’ • ICH, E10: ‘Choice of Control Group in Clinical Trials’ • Points to Consider papers (EU) • Therapeutic area specific Guidelines

  45. Regulatory Submissions - EU Points to Consider • Clarify and expand on issues raised in ICH E9 • Adjustment for Baseline Covariates • Application with 1) Meta Analysis; 2) One Pivotal Trial • Switching between Superiority and Non-Inferiority • Multiplicity Issues in Clinical Trials • Missing Data • Choice of Non-Inferiority Margin • Data Monitoring Committees • Confirmatory Clinical Trials with Flexible Design and Analysis Plan

  46. Statistical Thinking for Non-Statisticians in Drug Regulation by Richard Kay • ISBN 9780470319710 • RRP £42.50; with 10% discount £38.25 • Quote promotion code: VA259 • Order online at www.wiley.com • Richard Kay (richard.kay@rkstatistics.com)

  47. Questions
