CIHR Team in Microsimulation


  1. CIHR Team in Microsimulation Simulation Technology for Applied Research (STAR) Montreal, November 24-25, 2009

  2. Goal of the Meeting
  • To develop a detailed plan for achieving the objectives of the project, specifically:
    • To identify the investigators responsible for each objective/subproject
    • To specify the final products (software, publications, course syllabus, etc.)
    • To specify milestones and deadlines for each product

  3. NET Vision
  • The end-product of this 5-year research program will be:
    • a set of integrated, validated, transparent, and user-friendly disease simulation models
    • widely known and accessible to policy-makers and researchers within Canada and internationally
    • supported by extensive documentation and novel substantive results published in highly respected scientific journals

  4. NET subprojects
  • Model-specific subprojects
  • General subprojects
  • KTE-related subprojects
  • Training-related subprojects
  • The book subproject

  5. Model-specific subprojects
  • Breast cancer model: Validation?
  • Colon cancer model: Validation?
  • Diabetes model: Development/validation
  • CHD model: Modification/validation
  • OA model: Further development/validation, sensitivity analysis, applications
  For each subproject:
  • Final product from the NET
  • Person responsible
  • Milestones and deadlines
  • KTE aspects

  6. General subprojects
  • Validation framework and methods review
  • Microsimulation ontology
  • Software development
  • Integration of POHEM
  • Application of integrated model: obesity
  • Application of integrated model: health inequalities
  For each subproject:
  • Final product from the NET
  • Person responsible
  • Milestones and deadlines
  • KTE aspects

  7. KTE-related subprojects
  • NET website
  • NET repository
  • POHEM model documentation
  • Advisory committee (policy makers)
  For each subproject:
  • Final product from the NET
  • Person responsible
  • Milestones and deadlines

  8. Training
  • Trainees and trainee awards
    • New investigators
    • Post-doc fellows
    • PhD students
    • Master’s students
    • Summer students
    • Other
  • Course on microsimulation
  • Final product from the NET
  • Person responsible
  • Milestones and deadlines
  • KTE aspects

  9. The Book
  • Introduction to epidemiological simulation models
  • Types of models, examples of models
  • Microsimulation ontology
  • Model development and validation framework
  • Sensitivity analysis
  • Statistical issues
  • General description of POHEM
  • Disease-specific POHEM models
  • Multi-disease POHEM model
  • Examples of applications
  • The future of modeling

  10. POHEM-OA

  11. POHEM-OA model schematic (flattened slide diagram). Components shown: region, income, education, and other risk factors (CCHS); BMI (CCHS; regression model, NPHS 1994-2004); OA incidence (age and sex, CCHS 2001; incidence model, NPHS 2000-02); OA diagnosis, co-morbidity, OA stage, and OA surgery (crude rates, BCLHD); OA drugs, effect of drugs, and side-effects (literature); HRQOL (Tobit model, CCHS 2001); direct costs (cost models: regression, NPHS; Tobit model, VGH 2007); indirect costs (cost model, literature); and death. Several components are marked “no model at this time”.
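  To make the structure above easier to read as an algorithm, here is a minimal, self-contained sketch of an individual-level, discrete-time simulation of OA onset. It is an illustration only, not the POHEM-OA implementation: the hazard table, the assumed BMI effect, the age cut-off, and the synthetic cohort are all invented placeholders.

```python
import random

# Hypothetical annual OA incidence hazards by age group and sex
# (placeholder values, NOT actual POHEM-OA parameters).
BASE_HAZARD = {("F", "40-59"): 0.004, ("F", "60+"): 0.012,
               ("M", "40-59"): 0.003, ("M", "60+"): 0.008}
BMI_RELATIVE_RISK = 1.05   # assumed hazard multiplier per BMI unit above 25

def age_group(age):
    return "60+" if age >= 60 else "40-59"

def annual_oa_hazard(sex, age, bmi):
    """Illustrative incidence model: base rate by age/sex scaled by BMI."""
    base = BASE_HAZARD.get((sex, age_group(age)), 0.0)
    return base * BMI_RELATIVE_RISK ** max(0.0, bmi - 25.0)

def simulate_person(sex, start_age, bmi, horizon=40, seed=None):
    """Run one simulated life course; return age at OA onset, or None."""
    rng = random.Random(seed)
    for year in range(horizon):
        age = start_age + year
        if age < 40:
            continue          # assume negligible incidence before age 40
        if rng.random() < annual_oa_hazard(sex, age, bmi):
            return age
    return None

# Aggregate over a synthetic cohort of women aged 40 with BMI 29.
onsets = [simulate_person("F", 40, 29.0, seed=i) for i in range(10000)]
print("Simulated cumulative incidence by age 80:",
      sum(o is not None for o in onsets) / len(onsets))
```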

  12. POHEM-OA – update
  • Admin database linked to CCHS (for parameter validation): data for 1991-2004 received, preliminary analyses initiated, waiting for data update (to 2008)
  • Parameter validation in Ontario and Quebec: preliminary discussions completed, SAS code will be ready to send out in December/January
  • Sensitivity analysis: methodology partially developed, will continue next year
  • Cost module: work will start in January
  • Obesity application: detailed plan developed, actual simulations to start in December
  • OA treatment application: preliminary plan developed, simulations to start next year (January/February)

  13. POHEM-OA publication update
  Journal publications
  • Kopec JA, Sayre EC, Flanagan WM, Finès P, Cibere J, Rahman MM, Bansback N, Anis AH, Jordan JM, Sobolev B, Aghajanian J, Kang W, Greidanus NV, Garbuz DS, Hawker GA, Badley EM. Development of a population-based microsimulation model of osteoarthritis in Canada. Osteoarthritis Cartilage (in press).
  Presentations and abstracts
  • Sayre EC, Finès P, Flanagan WM, Rahman MM, Kang W, Cibere J, Anis AH, Badley EM, Kopec JA. A Tobit model for predicting Health Utilities Index Mark 3 from osteoarthritis disease duration: a population-based study. Presented at the Annual Scientific Meeting of the American College of Rheumatology, Philadelphia, October 16-21, 2009.
  • Kopec JA, Finès P, Flanagan WM, Sayre EC, Rahman MM, Bansback N, Cibere J, Anis AH, Jordan JM, Badley EM. Projecting the burden of osteoarthritis in Canada using microsimulation. Presented at the Annual Meeting of the European League Against Rheumatism, Copenhagen, June 10-13, 2009.
  • Finès P, Kopec JA, Flanagan WM, Sayre EC, Rahman MM. Microsimulation of osteoarthritis in Canada – Case study of a chronic disease in Canada. Presented at the Meeting of the International Microsimulation Association, Ottawa, June 8-10, 2009.

  14. Model validation: Conceptual issues

  15. Agenda
  • Validation framework
  • Validation principles
  • Questions re: validity evidence from examining model development
  • Questions re: validity evidence from examining model output
  • Rating of validity evidence
  • Validation-related subprojects

  16. Validation framework: Sources of validity evidence
  • Evidence from examining model development
    • Conceptual model validity (theories, definitions, content, structure)
    • Parameter validity (parameters based on expert opinion, literature, data analysis, databases, calibration)
    • Computer program validity (type of simulation, software, code, internal organization)
  • Evidence from examining model output
    • Plausibility (face validity)
    • Internal consistency
    • Parameter sensitivity
    • Between-model comparisons
    • Comparisons with external data
  • Evidence from examining the consequences of model-based decisions
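  As a concrete (and deliberately simplified) illustration of the "comparisons with external data" item above, the sketch below compares simulated age-specific prevalence with an external series and reports the largest relative deviation. Both series, the age groups, and the helper relative_differences are invented for the example and do not come from POHEM output.

```python
# Hypothetical check of simulated age-specific OA prevalence against an
# external data series (all numbers invented for illustration).
simulated_prevalence = {"40-49": 0.06, "50-59": 0.11, "60-69": 0.19, "70+": 0.28}
external_prevalence  = {"40-49": 0.05, "50-59": 0.12, "60-69": 0.18, "70+": 0.30}

def relative_differences(sim, obs):
    """Relative difference per age group: (simulated - observed) / observed."""
    return {g: (sim[g] - obs[g]) / obs[g] for g in obs}

diffs = relative_differences(simulated_prevalence, external_prevalence)
worst = max(diffs, key=lambda g: abs(diffs[g]))
print("Largest relative deviation: %s (%+.1f%%)" % (worst, 100 * diffs[worst]))
```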

  17. Validation principles
  • Models gain credibility through thorough development, extensive validation, and use
  • Full and complete model validation is never possible; validation never ends
  • A model can be valid (and validated) for one application and not valid (or validated) for another; for example, a model may be valid as an aid to decision making but not as a forecasting tool, and vice versa
  • Epidemiological microsimulation models such as the POHEM models are developed for multiple purposes and should be validated accordingly

  18. Validation principles – cont.
  • Model validation studies are of relatively high interest and should not be too difficult to publish
  • Published validation studies tend to increase model uptake by researchers
  • Validity evidence based on examining the model development process can/should be part of the model description
  • Probably the most powerful validation studies (but also the most difficult to do) are sensitivity analyses, between-model comparisons, and validations against external data
  • Given time and other constraints, we need to strike the right balance between model validation and applications

  19. Evidence from examining model development
  Questions re: evidence of conceptual model validity, parameter validity, and computer program validity
  • Should we try to include this type of validity evidence in all papers describing POHEM models?
  • How extensive should this evidence be? Should we follow our own guidelines/framework?
  • What other documentation should we develop to include those results that are not published or publishable?

  20. Evidence from examining model output
  Questions re: parameter sensitivity, between-model comparisons, and comparisons with external data
  • Should this type of validation be part of the NET?
  • Is this type of validation equally important for all models?
  • Are all these sources of validity evidence equally important and feasible?
  • Do we know exactly how to do it?
  • If we do it, should we aim to publish all the results? If not, what other documentation should we develop to include those results that are not published?
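  For the "parameter sensitivity" question above, a one-way sensitivity analysis is the simplest starting point: perturb one input at a time, hold everything else fixed, and record the change in a summary output. The sketch below uses a hypothetical run_model() stand-in and invented numbers; it is not part of the NET methodology.

```python
# One-way parameter sensitivity sketch: perturb one input at a time and
# record the change in a summary output.  run_model() is a stand-in for a
# full simulation run, and all numbers are invented for illustration.

def run_model(annual_incidence, mean_duration_years=20.0):
    """Toy summary output: steady-state prevalence = incidence x duration."""
    return annual_incidence * mean_duration_years

base_incidence = 0.010
for scale in (0.8, 1.0, 1.2):      # vary the parameter by +/- 20%
    value = base_incidence * scale
    print("incidence = %.4f -> projected prevalence = %.3f"
          % (value, run_model(value)))
```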

  21. Rating of validity evidence
  • How many aspects have been validated?
  • How detailed and transparent is the description of the validation of each aspect?
  • How extensive is the validation of each aspect (many different approaches)?
  • How quantitative is the validation of each quantitative aspect?

  22. Validation-related subprojects
  • Model validation framework
  • Sensitivity analysis – a review
  • Disease-specific POHEM models – further development/validation (breast, colon, CHD, diabetes, OA)
  • Multi-disease POHEM model – description and validation

  23. Example: Model validation framework
  • Final product: paper/chapter
  • Person responsible: Jacek
  • Deadline: February 2010 (submission)
  • KTE: publication

  24. Training

  25. Agenda
  • Trainee awards
  • Course development

  26. Trainee awards
  • How many
  • What type
  • How much
  • When
  • For how long
  • For what subprojects
  • Review of proposals

  27. Course development
  • Audience
  • Content
  • Level
  • Delivery
  • Availability
  • Persons responsible
  • Milestones and deadlines
  • KTE aspects
