
CTSA Clinical Research Management Workshop - Debrief


Presentation Transcript


  1. CTSA Clinical Research Management Workshop - Debrief — Michele Ehlman

  2. CTSA
  • A national consortium of medical research institutions, funded through Clinical and Translational Science Awards (CTSA), is working together to improve the way biomedical research is conducted nationwide. Consortium members share a common vision: to reduce the time it takes for laboratory discoveries to become treatments for patients, to engage communities in clinical research efforts, and to train clinical and translational researchers.
  • The momentum behind the CTSA program continues to build as new connections emerge within, across, and beyond the consortium. Launched in 2006, the program now includes 39 medical research institutions in 23 states. When fully implemented, it will support approximately 60 CTSAs across the nation.

  3. Links
  • Calendar of events: http://www.ctsaweb.org/index.cfm?fuseaction=event.showCommAct&
  • CTSA Federated Access Consortium: http://www.ctsaweb.org/federatedhome.html

  4. Synopsis
  • I attended the Second Annual CTSA Research Management Workshop.
  • Key areas covered: improvements in contracts and IRB approval processes.
  • Michael Joyner – Professor of Anesthesiology, Associate Director of CTSA, Deputy Director of Research, Mayo Clinic
  • Use of metrics in process improvement
  • Reviewed the current process at Mayo and the improvements they have achieved (468 days to approval in 2005 vs. 117 days today).
  • Noted that if you start with a bad paper process and model an electronic equivalent, you get a bad electronic solution – the process needs to be fixed first.

  5. Briggs Morrison, Senior Vice President, Pfizer – Sponsored Trials at Academic Medical Centers
  • Industry tends not to choose academic sites due to differences in agendas
  • Quality issues
    • Patient experience
    • Relevant patients
    • Ability to follow protocol
    • Data completeness
  • Speed – slower to get started and contracted
    • Trial initiation
    • Enrolling patients
  • Cost
  • Inability to validate tools – prevents collected data from being used in an FDA audit

  6. Joseph Comardo, Senior Vice President, Wyeth – Industry's Response to Concerns of Academic Centers
  • Expanded scope and complexity
    • More resources required
    • Multiple parties
    • Contract issues
  • Industry is "goal" directed – not the objective of academic research
  • Industry lacks understanding of the academic mission
  • Requires a standard process across all therapeutic areas, globally
    • Country-wide master agreements???
    • Standardized tools
    • Standardized templates
    • Access using a single logon (ability to sync global security to local security)

  7. Jonathan Kagan, Associate Director, Division of Clinical Research, NIAID – Time-Based Process Analysis for NIAID Extramural HIV/AIDS
  • Quality vs. value of studies
  • Diminished value related to frustration with the amount of time it takes to get things done
  • IRB approval
  • Scientific review
  • Interaction – industry / tech transfer
  • Adequacy of resources

  8. Sheila Prindiville, Director, Coordinating Center for Clinical Trials, NCI – NCI View of Clinical Research at Academic Centers
  • Prioritization / scientific quality
    • Involve stakeholders in the design and prioritization of clinical trials that address the most important questions
  • Coordination
    • Through data sharing and incentives for collaboration
  • Standardization
    • A standardized informatics infrastructure and clinical research tools
    • A common clinical trials language (contracts (START), CDEs, CRFs…)
  • Operational efficiency
    • Efficient use of resources – improve cost effectiveness, accrual rates, and rapid trial initiation
  • Enterprise-wide / integrated management

  9. Breakout Sessions
  • There were 6 breakout sessions; I attended each for about 20 minutes – essentially working-group sessions
  • Electronic Systems for IRB and CRM
  • Re-Engineering Contracts
  • IND/IDE Topics
  • Central Model of CT Support
  • IRB Review for Multicenter Studies
  • Educational Resource Sharing

  10. Panel Discussion / Closing Thoughts
  • Amazing how little one hand knows about the other (NIH)
  • Accrual is the final frontier (need a better way to direct patients to trials and find trials for patients)
  • Academic, industry, and NIH agendas differ
  • Have sites be required to post their metrics (should this be part of the RFA? – make it easier for all to understand the rules)
    • Performance measures
    • Metrics
    • Reporting
