
Implementation of Medidata Rave at NCIC



Presentation Transcript


  1. Implementation of Medidata Rave at NCIC Lam Pho – IT Director Dora Nomikos – Trial Management Team Leader August 3, 2010

  2. Outline of Presentation • Medidata Rave in NCIC CTG • Integrations • Key considerations for Rave Implementation • Lessons Learned

  3. Where is NCIC CTG with Rave? • Had some experience with EDC/CDMS (i.e. OC/RDC) • Reviewed/assessed various EDC/CDMS systems since 2003 • Selected Medidata Worldwide Solution Inc. (Rave) as CDMS provider in late 2006 • Rolled out first Rave trial in late 2007 • 18 trials are currently in Rave (9 phase III) • Some phase III trials are international trials • As of July 27, 2010: 1,397 patients accrued, 184 sites and 861 users across Europe, Australia, South America, Asia and Canada • MA32 recently activated (a 3,582-patient breast cancer trial led by NCIC CTG through NCI/CTEP) • Medidata Rave version 5.6.1 (upgrading to 5.6.3 soon)

  4. Key considerations for selecting an EDC system • Meets NCIC CTG functional and non-functional requirements • Validated system • Hosting • Technology transfer • User acceptance • Training • User support and technical support • Cost • Support for standards • Integrations • Vendor profile, e.g. financial stability, key personnel, etc.

  5. NCIC CTG Personnel involved in EDC • EDC Executive Committee (replaced by CODE) • Group Director, Trial Management Group Manager, QA Manager, IT Director • Oversees EDC development, financial/resource commitments, group processes, etc. • EDC Design Committee • Phase I/II/III Study Coordinators (SCs), Sr. Biostatistician, Project Coordinator (PC), QA, ER, AMG/Monitor, IT staff (developer and SAS programmer) • Deals with eCRF design issues, e.g. cycle-based vs. running log • Develops generic eCRF templates for the standard modules: AEs, Hematology, etc. • EDC Validation Committee • SC, Sr. Biostatistician, PC, QA, Monitor, and IT staff • Responsible for checking the completeness of the design/layout, content, and associated edit checks for the NCIC CTG generic data collection templates designed by the EDC Design Committee • EDC Issues Committee (meets once every 2 weeks) • EDC-SCs, EDC Trial Team SCs, IT, AMG

  6. NCIC CTG Personnel involved in EDC • Rave Developers • EDC Study Coordinators • Knowledge of EDC principles/procedures and eCRF development • Work collaboratively with the trial team • Certified Medidata trainers • Train users • Others: IT Systems staff, CRAs from sites, Biostatisticians, QA, Operations, Audit-Monitoring, HelpDesk Support, etc. • 2009: Central office data management executive (CODE) created with responsibility for data management processes, including data capture in EDC • Many more tasks must be done and ready before study activation for EDC trials compared to paper-based trials

  7. Rave training for internal staff • NCIC CTG staff were trained in the following courses, based on the knowledge transfer plan: • “Rave 5.5 Accelerate” course: overview of the whole system and a detailed look at the Architect module, where the project is designed, built, and implemented • “Custom Functions” course: how to write custom edit checks and calculations in the C# programming language • “Lab Administration” course: how to use the Lab Administration module to integrate lab data into EDC trials • “Output/Extract” course: how to export data from Rave • “EDC Essentials/Train-the-Trainer” course: several staff were trained by Medidata to become “Certified Trainers”; successful candidates are allowed to train other users in how to use Rave • Additional training on new features

  8. Tasks done/used in EDC at NCIC CTG • Integration with other systems (patient registration/randomization, CTG membership roster, etc.) • Data entry screens • Real-time data checking (system queries) • Data verification • Data review by various roles within the system • Manual queries issued by various roles within the system • Data lock • Lab Administration (local and central lab data) • Custom Functions in C# • Global Library of generic eCRF templates with associated edit checks • Data extraction from Rave into SAS and Oracle • Four environments: Development, UAT, Training, Production
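Rave's real-time edit checks and custom functions are written in C#, as the slides note. As a minimal illustrative sketch of the pattern (shown in Python rather than C#, with an invented lab-range rule and hypothetical names), a check receives an entered value and either raises a system query message or stays silent:

```python
# Illustrative sketch of a real-time edit check of the kind Rave runs as a
# "custom function". In Rave these are written in C#; Python is used here
# purely to show the pattern. Function name and limits are hypothetical.

def check_hemoglobin(value_g_per_l, lower=100, upper=180):
    """Return a query message if the lab value is out of range, else None."""
    if value_g_per_l is None:
        return "Hemoglobin value is missing - please enter or mark not done."
    if not (lower <= value_g_per_l <= upper):
        return (f"Hemoglobin {value_g_per_l} g/L is outside the expected "
                f"range ({lower}-{upper} g/L) - please verify.")
    return None  # value passes; no system query is raised
```

The same shape covers the generic templates in the Global Library: each template ships with standard checks like this, and study-specific ones are layered on top.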

  9. Tasks done/used in EDC at NCIC CTG • SAE reporting within Rave • Enter SAEs into the same system as all other patient data • Email notifications of new/updated SAEs • Ability to generate reports in various formats e.g. CIOMS, PDF • Integrates with internal SAE Reporting utility (for ease of PC review) • On-line Help Desk Support System: eTickets, FAQs, etc • Integrated eLearning modules within Rave: DM, Monitor, Investigator, CRC, Browse • Training: • Face-to-Face training: CRAs, EORTC, TROG, Japan, etc. • Webcast • eLearning modules • Training materials: guide books (generic and study specific)

  10. User Feedback • Training • Feedback differs • Common questions: • Ease of data entry, very user-friendly interface • Timeliness • Trial-specific questions (e.g. randomizations) • System requirements • Some questions become FAQs on the Helpdesk Support System

  11. Current EDC Integrations • Internal Integrations • Real-time Randomizations via Mango Patient Randomization System • Real-time data checking via Oracle RDBMS • Real-time data population into Oracle RDBMS • Real-time SAE entry, notification, CIOMS generation and posting into E-Assessment web tool • Nightly data dump to Oracle RDBMS which integrates with administrative tools • Nightly SAS export • NCIC RSS data

  12. Patient Randomizations and Treatment System (Mango) Integration • Mango (Oracle, behind the scenes) exchanges data with Rave (SQL Server, where the user logs in) • Inputs: Participants Lists and Roster, Eligibility Checklist (EC), Ethics Database, EC Data Checks, Randomization Algorithm with Stratification • Outcome – Pass: display arm allocation and patient serial #; Rave then allows users to access post-randomization folders • Outcome – Fail: fail message; user is not allowed to randomize
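The randomization gate on this slide can be sketched as a simple pass/fail exchange: Rave submits the eligibility checklist, Mango runs the EC data checks, and only a passing result returns an arm allocation and serial number. This is a hypothetical Python sketch of that control flow; the required-question names and even-odd arm rule are invented (the real stratified algorithm lives in Mango):

```python
# Hypothetical sketch of the Rave -> Mango randomization gate described
# above. All names and the toy allocation rule are illustrative only.

import itertools

_serial = itertools.count(1)  # stand-in for Mango's serial-number sequence

def randomize(ec_answers, required=("consent", "ethics_approved", "eligible")):
    """Return (ok, message) mimicking Mango's pass/fail response."""
    failed = [q for q in required if not ec_answers.get(q)]
    if failed:
        # Fail: Rave shows the message and blocks randomization
        return False, f"EC check failed: {', '.join(failed)}"
    # Pass: allocate an arm (the real stratified algorithm runs in Mango)
    serial = next(_serial)
    arm = "A" if serial % 2 else "B"
    return True, f"Arm {arm}, patient serial #{serial:04d}"
```

On a pass, Rave unlocks the post-randomization folders for that patient; on a fail, the site sees the failure message and cannot proceed.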

  13. CTSU RSS Integration • RSS transactions are sent in XML format to CTG • A process inserts/updates the CTG version of the RSS database • The CTG RSS data then feeds EDC, Mango and the CTG Participant Database
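The insert/update step above is a classic XML-to-database upsert. A minimal sketch, assuming an invented transaction shape and table (the real RSS schema is CTSU's and not shown here), using SQLite purely as a stand-in for the CTG database:

```python
# Sketch of RSS transaction processing: parse one XML transaction and
# insert/update the CTG copy of the RSS data. Element names, table name
# and columns are invented for illustration.

import sqlite3
import xml.etree.ElementTree as ET

def apply_rss_transaction(conn, xml_text):
    """Upsert one site record from an RSS XML transaction."""
    root = ET.fromstring(xml_text)
    site_id = root.findtext("SiteID")
    status = root.findtext("Status")
    conn.execute(
        "INSERT INTO rss_sites (site_id, status) VALUES (?, ?) "
        "ON CONFLICT(site_id) DO UPDATE SET status = excluded.status",
        (site_id, status),
    )
    conn.commit()
    return site_id, status
```

Replaying transactions in order keeps the local copy current, which is what lets EDC, Mango and the participant database all read from one synchronized RSS table.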

  14. SAE Integration • SAE entered into EDC • Triggers SAE notification emails and a CIOMS report • A snapshot is stored at CTG in XML format • Posted to the web-based E-Assessment System

  15. EDC Data Flow (EDC system, mail, Oracle, SAS) • SITE: randomization and arm allocation; data entry including SAEs and comments; responding to system queries; query resolution and data corrections; audit trail, reports and task summary available • NCIC CTG CENTRAL OFFICE: data review (on-screen, tracked); data cleaning (manually issue e-queries, process query responses); monitoring (on-screen verification, manually issued e-queries); audit trail, reports and task summary; data lock and final data sets for final analysis • PARTNERS: supporting documents and paper CRFs (e.g. QOL); logging and paper data entry; data review and data cleaning via emailed queries and responses/data corrections; intermittent data sets; SAS data checking with results returned

  16. Future EDC Integrations • External Integrations • CTSU RSS data directly • CTSU OPEN/Rave integration • Real-time integration between Rave SAE reporting and AdEERS • Via the CTEP AdEERS API/web services • XML transactions • Had a couple of teleconference calls with CTEP AdEERS IT staff
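The planned Rave-to-AdEERS handoff amounts to wrapping SAE data in an XML transaction and posting it to a web service. A minimal sketch of that shape, assuming a placeholder URL and invented element names (the real AdEERS API contract is CTEP's and not reproduced here); the request is built but deliberately not sent:

```python
# Hypothetical sketch of an XML-transaction POST to a web service, of the
# kind planned for the Rave/AdEERS integration. The URL and element names
# are placeholders, not the real AdEERS API.

import urllib.request
import xml.etree.ElementTree as ET

def build_sae_request(sae, url="https://example.org/adeers/api"):  # placeholder
    """Serialize SAE fields to XML and prepare (not send) an HTTP POST."""
    root = ET.Element("SAETransaction")
    for field, value in sae.items():
        ET.SubElement(root, field).text = str(value)
    body = ET.tostring(root, encoding="utf-8", xml_declaration=True)
    return urllib.request.Request(
        url, data=body, method="POST",
        headers={"Content-Type": "application/xml"},
    )
```

In a real integration, the response would be parsed for an acknowledgement or error and the outcome written back to the Rave audit trail.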

  17. Data Flow for CTSU Registrars using MA32 Rave/WEC • US Site Registrar enters data in EDC/Rave (the WEC and all CRFs are in Rave) • Rave and Mango exchange XML v.2 request and response messages via PHP at CTG (PHP/Apache) • Mango performs randomization, policy enforcement, etc., and writes to the logging/patient common summary and enrolment tables (pt_allocations, pt_wec, etc.) • Confirmation of enrolment is sent via email

  18. Data Flow for US Site Registrars using OPEN on EDC Trials • OPEN (Java, at CTSU) transfers registration/admin data and CDISC ODM data (metadata version and clinical ODM data) as XML over Java objects to RandoNode (Java, at CTG; scooby J2EE Glassfish) • Rave and Mango exchange XML v.2 request and response; Mango performs randomization, policy enforcement, etc., and writes to the logging/patient common summary and enrolment tables (pt_allocations, pt_wec, etc.); confirmation of enrolment is sent via email • The ODM/Rave load process creates the new patient in Rave; the ODM XML is stored in a table (trial_cd, trial_stg, Pt_ID, ODM_XML); trial metadata is loaded on each protocol amendment • The EC is developed in Rave identical to OPEN, with all edit checks duplicated; trial/amendment-specific custom programming is needed
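The ODM load step above receives clinical data as CDISC ODM XML and must pull out the subject key and item values before creating the patient in Rave. A heavily simplified sketch (real ODM uses XML namespaces and a deeper Study/Event/Form/ItemGroup hierarchy; the snippet below is illustrative only):

```python
# Sketch of extracting subject data from a (heavily simplified) CDISC ODM
# clinical-data snippet, as in the ODM/Rave load step. Real ODM documents
# are namespaced and nested far more deeply.

import xml.etree.ElementTree as ET

ODM_SNIPPET = """\
<ODM FileOID="demo">
  <ClinicalData StudyOID="MA32" MetaDataVersionOID="v1">
    <SubjectData SubjectKey="001">
      <ItemData ItemOID="AGE" Value="54"/>
      <ItemData ItemOID="SEX" Value="F"/>
    </SubjectData>
  </ClinicalData>
</ODM>
"""

def extract_subject(odm_xml):
    """Return (subject_key, {ItemOID: Value}) from one ODM document."""
    root = ET.fromstring(odm_xml)
    subject = root.find("./ClinicalData/SubjectData")
    items = {i.get("ItemOID"): i.get("Value")
             for i in subject.findall("ItemData")}
    return subject.get("SubjectKey"), items
```

Storing the raw ODM XML alongside the extracted values (as in the trial_cd/trial_stg/Pt_ID/ODM_XML table) preserves the original transaction for audit and reload on protocol amendments.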

  19. Key considerations for Rave Implementation • Establish EDC committees, e.g. Executive Committee or Core Team, Steering Committee • Establish sub-teams or sub-committees: • Process; Integrations; eCRF Design; Reporting; etc. • Define key deliverables for each sub-committee • Each reports to the Executive Committee or Core Team

  20. Key considerations for Rave Implementation • Hosting: Medidata; NCI; in-house • Study build: Medidata; 3rd party; in-house • Establish a Knowledge Transfer Plan with Medidata • Determine resources required, e.g. personnel • Determine systems required: # of studies, duration of the studies, sample size, sites, users, eCRFs, etc. • Develop eCRF templates and EDC processes/SOPs/WKIs • Determine integrations required • Method of generating datasets for analysis: SAS on Demand, Medidata-created datasets, in-house SAS conversion, etc. • Reporting: Business Objects, J Review, SAS on Demand, others (SAS Enterprise, in-house data warehouse) • Site training: eLearning; face-to-face (investigator meetings); Webex; combinations of eLearning/face-to-face, etc. • Support models: level of support, in-house/outsourced

  21. Lessons Learned • Build a study in Rave before crystallizing processes, standards, and infrastructure for tools/integrations • Set a “realistic” timeline for rolling out the 1st trial – eight months for NCIC • Avoid a very large and complex 1st trial • We changed some processes/procedures along the way as we learned more about Rave and how it fits our environment • “Specialized” personnel: developers, EDC Study Coordinators • Take end-users’ input/suggestions seriously (e.g. no double data entry) and provide forums for them to express their opinions, e.g. annual meeting, help-desk support system

  22. Lessons Learned • Decide where to use Rave-provided features vs. where to integrate Rave with the group’s existing systems, e.g. forms overdue, query reminders, etc. • eCRF templates: light/heavy vs. all-inclusive • Standard edit checks/custom functions for each eCRF template, while allowing study-specific ones to be added • Groups should try to use the same setup for workflows, interface, etc. where possible • Copy a similar trial vs. use global eCRF templates • Utilize Medidata technical support, e.g. for bugs • Credit users with previous Medidata Rave training from other organizations, case by case depending on the Rave version trained on and proof of training

  23. Lessons Learned • Migration: time-consuming; site by site based on IRB/REB approval; site groups • User and site administration: time-consuming and manual; make sure to allow adequate resources/personnel • Real-time data updates (web services) vs. data dump • Clinical data for analysis vs. non-clinical data • Training knowledge gained on one trial is transferable to other trials as long as the studies are built similarly, e.g. randomization process, SAE reporting, etc.

  24. Lessons Learned • Generic guidebook vs. supplemental study-specific slides/guidebook • Manual queries by data point work well, but it is “tricky” to issue manual queries on multiple data fields/missing data • Avoid open-ended queries • Communication tools outside of Rave: • Mock eCRFs • Inventory spreadsheet • NCIC web-based report distribution (forms overdue, query reminders, etc.), which utilizes NCIC rostering (Rave and non-Rave users): email with links to reports

  25. Lessons Learned • Access for “other” groups/parties/roles affects how you set up user groups, roles, marking groups, etc. • External monitors vs. in-house monitors • Other-group rostering: AGITG, EORTC, etc. • Real-time reports to partners, e.g. SAEs
