
Validation of Predictive Models: Acceptable Prediction Zone Method


Presentation Transcript


  1. Validation of Predictive Models: Acceptable Prediction Zone Method Thomas P. Oscar, Ph.D. USDA, Agricultural Research Service Microbial Food Safety Research Unit University of Maryland Eastern Shore Princess Anne, MD

  2. Background Information

  3. Terminology • Performance evaluation: the process of comparing observed and predicted values. • Validation: a potential outcome of performance evaluation; requires establishment of criteria.

  4. Key terms: test data, interpolation, extrapolation, performance, bias, accuracy, systematic bias, criteria.

  5. Predictive Modeling (diagram). Secondary models for No, λ, μmax and Nmax each map observed parameter values to predicted values; the primary model converts these parameters into N(t), so observed parameters give observed N(t) and predicted parameters give predicted N(t). The secondary and primary models together form the tertiary model.

  6. Performance Evaluation • Stage 1: goodness-of-fit (primary and secondary models) and verification (tertiary models). • Stage 2: interpolation (all models). • Stage 3: extrapolation (all models).

  7. Test Data Criteria: Interpolation • Independent data. • Within the response surface. • Uniform coverage. • Collected with the same methods. • Example of an incomplete and biased evaluation: model data (10 to 40°C) versus test data (25 to 40°C).

  8. Test Data Criteria: Extrapolation • Independent data. • Outside the response surface. • Only one variable differs. • Collected with the same methods. • Example of a confounded comparison: strain A in broth versus strain B in food.

  9. Acceptable Prediction Zone Method: Description

  10. Relative Error (RE) • RE for λ = (predicted − observed)/predicted. • RE for N(t), No, μmax and Nmax = (observed − predicted)/predicted. • RE < 0 is “fail-safe”; RE > 0 is “fail-dangerous”.
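
A minimal sketch in Python of these sign conventions; the function name and the example values are illustrative, not part of the presentation.

def relative_error(observed, predicted, parameter):
    """Relative error (RE) with the sign conventions from slide 10.

    For lambda (lag time): RE = (predicted - observed) / predicted.
    For N(t), No, mu_max and Nmax: RE = (observed - predicted) / predicted.
    """
    if parameter == "lambda":
        return (predicted - observed) / predicted
    return (observed - predicted) / predicted

# RE < 0 is "fail-safe", RE > 0 is "fail-dangerous".
re = relative_error(observed=0.20, predicted=0.25, parameter="mu_max")
print(re, "fail-safe" if re < 0 else "fail-dangerous")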

  11. Performance Factor • %RE = (REin/REtotal) × 100, where REin is the number of relative errors that fall inside the acceptable prediction zone and REtotal is the total number of relative errors.

  12. Performance Criteria • Acceptable predictions: −0.30 < RE < 0.15 for μmax; −0.60 < RE < 0.30 for λ; −0.80 < RE < 0.40 for N(t), No, Nmax. • Acceptable performance: %RE ≥ 70.
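
A sketch of the acceptance check itself, assuming relative errors have already been computed as on slide 10; the zone boundaries are those listed above, while the function and variable names and the example RE values are illustrative.

# Acceptable prediction zones from slide 12: (lower, upper) bounds on RE.
APZ = {
    "mu_max": (-0.30, 0.15),
    "lambda": (-0.60, 0.30),
    "N(t)": (-0.80, 0.40),
    "No": (-0.80, 0.40),
    "Nmax": (-0.80, 0.40),
}

def percent_re(relative_errors, parameter):
    """%RE = 100 x (number of RE inside the acceptable prediction zone) / (total RE)."""
    lower, upper = APZ[parameter]
    inside = sum(1 for re in relative_errors if lower < re < upper)
    return 100.0 * inside / len(relative_errors)

def acceptable_performance(relative_errors, parameter):
    """Acceptable performance criterion from slide 12: %RE >= 70."""
    return percent_re(relative_errors, parameter) >= 70.0

re_values = [-0.12, 0.05, -0.45, 0.10, -0.02]          # illustrative values only
print(percent_re(re_values, "mu_max"), acceptable_performance(re_values, "mu_max"))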

  13. Acceptable Prediction Zone Method: Demonstration

  14. Model Development Design • Salmonella Typhimurium • No = 4.8 log CFU/g • Sterile cooked chicken • 10, 12, 14, 16, 20, 24, 28, 32, 36, 38, 40°C • Viable counts • BHI agar • 12 per growth curve

  15. Performance Evaluation Design: Secondary Models (Interpolation) • Salmonella Typhimurium • No = 4.8 log CFU/g • Sterile cooked chicken • 11, 13, 15, 18, 22, 26, 30, 34, 37, 39°C • Viable counts • BHI agar • 12 per growth curve

  16. Primary Model: Logistic with Delay • N(t) = No if t ≤ λ • N(t) = Nmax/(1 + [(Nmax/No) − 1]exp[−μmax(t − λ)]) if t > λ
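
A sketch of this primary model as a Python function; the parameter values in the example call are illustrative only and are not fitted values from the presentation.

import math

def logistic_with_delay(t, No, Nmax, mu_max, lag):
    """Primary model from slide 16: logistic growth with a delay (lag) phase.

    N(t) = No                                                   for t <= lag
    N(t) = Nmax / (1 + [(Nmax/No) - 1] exp(-mu_max (t - lag)))  for t > lag
    """
    if t <= lag:
        return No
    return Nmax / (1.0 + ((Nmax / No) - 1.0) * math.exp(-mu_max * (t - lag)))

# Illustrative parameter values only.
for t in (0, 5, 10, 20, 40):
    print(t, round(logistic_with_delay(t, No=4.8, Nmax=9.5, mu_max=0.4, lag=3.0), 2))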

  17. Primary Model Performance: Goodness-of-fit

  18. Secondary Model for No • No = mean No

  19. No Model Performance

  20. Secondary Model for λ: Hyperbola with Shape Factor • λ = [41.47/(T − 7.325)]^1.44

  21. λ Model Performance

  22. Secondary Model for μmax: Modified Square Root • μmax = 0.01885 if T ≤ 11.43 • μmax = 0.01885 + [0.004325(T − 11.43)]^1.306 if T > 11.43

  23. μmax Model Performance

  24. Secondary Model for Nmax: Asymptote Model • Nmax = exp(2.348[((T − 9.64)(T − 40.74))/((T − 9.606)(T − 40.76))])
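
Collecting the secondary models from slides 18, 20, 22 and 24 into one Python sketch; the numeric constants are those shown on the slides, the mean No of 4.8 log CFU/g comes from the model development design, and the function names and the loop over interpolation temperatures are illustrative.

import math

def no_model(mean_No=4.8):
    """Slide 18: No = mean No (4.8 log CFU/g in the model development design)."""
    return mean_No

def lag_model(T):
    """Slide 20: hyperbola with shape factor, lambda = [41.47/(T - 7.325)]^1.44."""
    return (41.47 / (T - 7.325)) ** 1.44

def mu_max_model(T):
    """Slide 22: modified square root model for mu_max."""
    if T <= 11.43:
        return 0.01885
    return 0.01885 + (0.004325 * (T - 11.43)) ** 1.306

def nmax_model(T):
    """Slide 24: asymptote model for Nmax."""
    return math.exp(2.348 * ((T - 9.64) * (T - 40.74)) / ((T - 9.606) * (T - 40.76)))

for T in (11, 18, 26, 34, 39):   # some of the interpolation test temperatures (slide 15)
    print(T, round(lag_model(T), 2), round(mu_max_model(T), 4), round(nmax_model(T), 2))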

  25. Nmax Model Performance

  26. Predictive Modeling (diagram, repeated from slide 5). Secondary models predict No, λ, μmax and Nmax from observed values; the predicted parameters feed the primary model to give predicted N(t), and the assembled whole is the tertiary model.
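
As the diagram indicates, the tertiary model chains the secondary model outputs into the primary model. A sketch of that chaining, reusing the logistic_with_delay, no_model, lag_model, mu_max_model and nmax_model functions defined in the earlier sketches; the temperature and time points are illustrative.

def tertiary_model(t, T):
    """Predict N(t) at temperature T by feeding secondary-model outputs
    (No, lambda, mu_max, Nmax) into the primary model."""
    return logistic_with_delay(
        t,
        No=no_model(),
        Nmax=nmax_model(T),
        mu_max=mu_max_model(T),
        lag=lag_model(T),
    )

# Example: predicted growth at 22°C, one of the interpolation temperatures.
for t in (0, 10, 20, 40, 80):
    print(t, round(tertiary_model(t, T=22), 2))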

  27. Tertiary Model Performance: Verification • %RE = 90.7

  28. Comparison of Models • Fisher’s exact test: P = 0.48, not significant.
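
The slide reports a Fisher’s exact test comparing models. A sketch of how such a comparison could be run with scipy, assuming a 2x2 table of acceptable versus unacceptable predictions for the two models; the counts below are illustrative, not the presentation’s data.

from scipy.stats import fisher_exact

# Rows: model A, model B; columns: acceptable, unacceptable predictions (illustrative counts).
table = [[36, 4],
         [38, 2]]

odds_ratio, p_value = fisher_exact(table)
print(f"P = {p_value:.2f}")   # a large P (e.g. > 0.05) means no significant difference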

  29. Performance Evaluation Design: Tertiary Model (Interpolation) • Salmonella Typhimurium • No = 4.8 log CFU/g • Sterile cooked chicken • 11, 13, 15, 18, 22, 26, 30, 34, 37, 39°C • Viable counts • BHI agar • 4 per growth curve

  30. Tertiary Model Performance: Interpolation

  31. Tertiary Model Performance: Interpolation • %RE = 97.5

  32. Should the validated tertiary model be used to predict chicken safety? • Evaluation for extrapolation to: other initial densities (No), other strains, other chicken products.

  33. Performance Evaluation Design: Tertiary Model (Extrapolation) • Salmonella Typhimurium • No = 0.8 log CFU/g • Sterile cooked chicken • 10, 12, 14, 16, 20, 24, 28, 32, 36, 40°C • Viable counts • BHI agar • 4 per growth curve

  34. Tertiary Model: Extrapolation to low No

  35. Tertiary Model Performance: Extrapolation to low No • %RE = 2.5

  36. Conclusions • Criteria are important for evaluating performance of models. • Consensus on validation would improve the quality and use of predictive models in the food industry.
