Using Risk Assessment to Drive Software Validation Decisions
Presentation Transcript

  1. Using Risk Assessment to Drive Software Validation Decisions Don Hopkins, Ph.D. Ursa Logic Corporation

  2. Software Development Today • Dozens of viable software methodologies currently in use. • Great variation in terminologies, procedures, deliverables, and ways of organizing work. • Each methodology embodies a strategy for safeguarding software quality. © 2004, DQRI. All rights reserved.

  3. Validation • Validation Plans • Requirement specifications • Simulation tests • User-site tests • Structural tests • Functional tests • Program build tests • Code inspections • Code walk-throughs • Technical reviews • Change control • Regression analysis • Risk assessments • Validation Reports From the Draft Guidance on Software Validation (2001) © 2004, DQRI. All rights reserved.

  4. In what ways can part 11 discourage innovation? By including an across-the-board requirement that covered software must be validated, and publishing a guidance that defined software validation in terms of specific deliverables, the FDA established a de facto standard software development methodology for covered systems, regardless of context and regardless of risk profile. © 2004, DQRI. All rights reserved.

  5. In what ways can part 11 discourage innovation? There has been no agreed-upon language for justifying anything less than, or anything different from, what is outlined in the guidance documents. © 2004, DQRI. All rights reserved.

  6. Risk Assessment • Risk assessment can provide a means of justifying diverse approaches to software development and validation. • Industry and the agency must agree on a common framework for performing risk assessment. © 2004, DQRI. All rights reserved.

  7. DQRI Risk Assessment Model • Standard risk model • Industry consensus on four critical questions © 2004, DQRI. All rights reserved.

  8. DQRI Risk Assessment Model Step 1: Common types of software used in clinical trials, and risks typically associated with them. © 2004, DQRI. All rights reserved.

  9. DQRI Risk Assessment Model Step 5: Outcomes that should be considered in assessing the impact of possible software failures. © 2004, DQRI. All rights reserved.

  10. DQRI Risk Assessment Model Step 6: Factors that contribute to the likelihood of system failures. © 2004, DQRI. All rights reserved.

  11. DQRI Risk Assessment Model Step 8: Dimensions on which validation procedures may vary depending on assessed risk scores. © 2004, DQRI. All rights reserved.
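The steps above describe a standard risk model: rate the impact of possible software failures, rate their likelihood, and let the combined score drive how much validation rigor is applied. A minimal sketch of that idea follows; the rating scales, score thresholds, and validation tiers here are illustrative assumptions for exposition, not the actual DQRI model.

```python
# Illustrative sketch of a standard risk model: risk = impact x likelihood,
# with the combined score mapped to a tier of validation rigor.
# The scales, cutoffs, and tier descriptions are assumptions, not DQRI's.

IMPACT = {"negligible": 1, "moderate": 2, "serious": 3, "critical": 4}
LIKELIHOOD = {"remote": 1, "occasional": 2, "probable": 3, "frequent": 4}

def risk_score(impact: str, likelihood: str) -> int:
    """Combine an impact rating and a likelihood rating into one score."""
    return IMPACT[impact] * LIKELIHOOD[likelihood]

def validation_tier(score: int) -> str:
    """Map a risk score to a level of validation rigor (illustrative cutoffs)."""
    if score >= 9:
        return "full validation: documented tests, code review, regression suite"
    if score >= 4:
        return "targeted validation: functional tests on high-risk features"
    return "minimal validation: vendor qualification and acceptance checks"

# Example: software whose failure would have serious impact but only
# occasional likelihood scores 3 * 2 = 6, placing it in the middle tier.
print(validation_tier(risk_score("serious", "occasional")))
```

The point of such a model is the one the slides make: it gives industry and the agency a shared, inspectable basis for scaling validation effort up or down with assessed risk, instead of applying one fixed set of deliverables to every system.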

  12. Recommendation #1 The across-the-board requirement for systems to be validated should be removed from Part 11. • Software validation cannot be separated from software development methodologies. • Software validation has no meaning that can be usefully generalized across diverse development methodologies. • Software development methodologies are still emerging; there is no consensus about which methodologies work best under what circumstances. • Software validation is not the only strategy available for controlling software-related risks. © 2004, DQRI. All rights reserved.

  13. Recommendation #2 Instead of mandating software validation, Part 11 should require risk assessment to: (a) identify software-related risks; (b) justify the controls adopted to control those risks. © 2004, DQRI. All rights reserved.

  14. Recommendation #3 The agency should work with industry to develop a guidance on risk assessment for software used in connection with electronic records. • Risk assessment model • Risks associated with common types of software used in clinical trials • Outcomes that should be considered in assessing the impact of possible software failures • Factors that contribute to the likelihood of system failures • Dimensions on which validation procedures may vary as assessments of risk increase or decrease. © 2004, DQRI. All rights reserved.

  15. Conclusion In addition to validation, audit trail, record retention, and record copying, should other areas of Part 11 (e.g., operational system and device checks) incorporate the concept of a risk-based approach? Yes. A risk assessment model can be generalized to provide a framework for justifying decisions about any activity that affects the quality of electronic data. © 2004, DQRI. All rights reserved.

  16. Thank You Don Hopkins, Ph.D., Ursa Logic Corporation, hopkins@ursalogic.com For inquiries about the Data Quality Research Institute (DQRI) please contact: Kaye Fendt, MSPH, kfendt@email.unc.edu © 2004, DQRI. All rights reserved.