
Developing a Validated Responsible Conduct of Research Test Bank



  1. Developing a Validated Responsible Conduct of Research Test Bank Holly Phernetton, MPH, MS James DuBois, PhD, DSc Saint Louis University

  2. Planned Collaboration with CITI • With ORI funding, DuBois, Mumford, Phernetton, and Antes have developed an RCR test bank: • >125 multiple-choice items aimed at assessing basic knowledge of RCR • 50 “pick 2” ethical decision-making items that test the ability to choose optimal responses to ethical problems described in vignettes

  3. Planned Collaboration with CITI • All items will be available for CITI to use at no cost • This presentation describes how we have already validated the test items and how we propose to work with CITI for final construct validation using classical test theory

  4. Background • Office of Research Integrity (ORI) identified 9 core areas that RCR courses should address (2000) • However, there was no guidance or consensus on what content should be taught within these core areas, nor on which overarching objectives of RCR training should be assessed • Further, there is no standard validated test that measures RCR education objectives

  5. Process for Test Bank Development • 3 Stages of Test Development • Delphi Survey to validate objectives and content areas • Development of the Items • Multiple choice items • Pick-2 items • Pilot-testing of items and calculation of psychometric properties

  6. Validation Process • Content Validity • Delphi survey: identify areas to assess • Item development: evidence-based answers • Expert review and revision • Construct Validity • Pilot-test items • Classical test theory

  7. Content Validity • Level 1: Delphi survey • Define objectives and determine important RCR content to be taught and knowledge assessed by consensus panel • Level 2: Item development • According to assessment recommendations from Delphi panel • Evidence-based answers from well-recognized RCR curricula and textbooks • Level 3: Expert review

  8. Delphi Survey • Funded by ORI • 2006-2007: four panels of experts convened to develop consensus on overarching RCR goals for instruction and assessment • Delphi process: anonymous • Minimizes groupthink and domineering group members • Maximizes benefits of group decision-making

  9. Delphi Survey • Formed 4 separate panels: each worked independently and simultaneously • Survey conducted online in 3 rounds • Round 1: Open-ended items asking what should be the objectives and content of RCR training • Round 2: Presented lists of edited responses from round 1—participants asked to rate the importance of teaching each topic • Round 3: Presented lists of edited responses from round 2—participants asked to rate importance of teaching and assessing each area

  10. Delphi Survey • 4 Panels • 1. Objectives (n=18) • Overarching objectives and assessed objectives • 2. Scientific Data (n=13) • Data management and research misconduct sections • 3. Scientific Relationships (n=14) • Mentor-trainee responsibilities, collaborative science, conflicts of interest and commitment • 4. Publication and Peer Review (n=13) • Publication practices/responsible authorship and peer review

  11. Delphi Survey Results • 7 Core Areas of RCR instruction examined • Panel Consensus: 2/3 of panelists supporting a rating of “important” or “very important” • Consensus on teaching 43 main topics • Consensus on assessing learning in 21 of the 43 main topics
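
The 2/3 consensus rule lends itself to a quick computational check. Below is a minimal Python sketch, assuming ratings are recorded as labeled categories; the function name and sample ratings are illustrative, not the study's actual analysis code.

    # Minimal sketch of the consensus rule described above (illustrative only):
    # a topic reaches consensus when at least 2/3 of panelists rate it
    # "important" or "very important".
    from typing import List

    def has_consensus(ratings: List[str], threshold: float = 2 / 3) -> bool:
        """Return True if at least `threshold` of panelists gave a supportive rating."""
        supportive = sum(1 for r in ratings if r in ("important", "very important"))
        return supportive / len(ratings) >= threshold

    # Hypothetical 13-member panel: 9 of 13 supportive ratings (~69%) meets the 2/3 bar.
    panel_ratings = ["very important"] * 7 + ["important"] * 2 + ["somewhat important"] * 4
    print(has_consensus(panel_ratings))  # True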

  12. Content Validity • Level 1: Delphi survey • Define objectives and determine important RCR content to be taught and knowledge assessed by consensus panel • Level 2: Item development • According to assessment recommendations from Delphi panel • Evidence-based answers from well-recognized RCR curricula and textbooks • Level 3: Expert review

  13. Item Development • Multiple Choice Questions (MCQ) • Developed by Saint Louis University (Dr. James DuBois, Holly Phernetton) • Stem and lead-in question: 1 correct answer and 3 distractors • Pick 2 Items (P2) • Developed by University of Oklahoma (Dr. Michael Mumford, Allison Antes) • Stem: vignette presenting ethical problem • Answer: pick 2 “optimal responses” from list of 8 options
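
For illustration, the two item formats can be thought of as simple records. The sketch below is a hypothetical Python representation; the class and field names are assumptions made for this example, not the project's actual item schema.

    # Hypothetical record layouts for the two item formats (field names assumed).
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class MCQItem:
        stem: str                    # scenario plus lead-in question
        options: List[str]           # 4 options: 1 correct answer and 3 distractors
        correct_index: int           # position of the single best answer

    @dataclass
    class Pick2Item:
        vignette: str                # scenario presenting an ethical problem
        options: List[str]           # 8 possible responses
        optimal_indices: List[int]   # positions of the 2 keyed "optimal responses"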

  14. MCQ Item Development • Follow National Board of Medical Examiners (NBME) guidelines • Rationale for NBME guidelines. Good test items: • Discriminate between those who know and those who do not know the material • Are constructed to reduce correct guessing by “test savvy” participants • Avoid being “tricky”

  15. Rationale Behind the NBME Guidelines • Items contain only critically important content • Must have 1 clearly best answer • No negative questions such as • “each of the following is correct except” • “which of the following is NOT correct” • No absolute terms such as “never” or “always” • Avoid “all of the above” and “none of the above” answers • All responses are of roughly equal length and form
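
These wording rules are mechanical enough to screen for automatically during item drafting. A toy Python sketch follows, assuming items are available as plain text; it only catches discouraged phrasings and cannot judge content-level rules such as having one clearly best answer.

    # Toy drafting aid that flags the discouraged phrasings listed above.
    def flag_wording_issues(item_text: str) -> list:
        """Return any discouraged phrasings found in an item's text."""
        text = f" {item_text.lower()} "
        discouraged = ["except", " not ", "never", "always",
                       "all of the above", "none of the above"]
        return [phrase for phrase in discouraged if phrase in text]

    print(flag_wording_issues("Each of the following is correct EXCEPT..."))  # ['except']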

  16. Item Example: MCQ A researcher commits research misconduct at a large university that receives federal research funding from the Public Health Service (PHS). Which of the following is correct regarding institutional obligations in such cases? A. Academic institutions are legally obligated to report all allegations of misconduct to the Office of Research Integrity prior to investigation. B. Institutions must have policies in place for the inquiry, investigation, and adjudication of allegations of misconduct. C. The federal government has to be immediately contacted in every case of alleged misconduct. D. Researchers found guilty of misconduct can never receive PHS funding again.

  17. Item Example: P2 Vignette: Dr. Davis has been culturing a strain of Escherichia coli in an oxygenated broth. For 12 years he has been gradually reducing glucose concentration in the medium. As a mid-career evolutionary biologist, he is interested in the possible evolution of the strain over thousands of generations. He is targeting an extremely unlikely adaptation: the ability to take up and metabolize citrate, something that E. coli strains are notably unable to do. After about 30,000 generations, the adaptation appears. Indeed, the bacteria, which also show some morphological changes, can thrive in a totally glucose-free medium, and phenotypes with the trait rapidly dominate the population.

  18. Item Example: P2I Vignette, cont’d An investigation revealed a new set of genes in Davis’ E. coli culture consistent with contamination by the common soil bacterium, Serratia marcescens. Yet, analysis could not identify a single living S. marcescens in the culture. S. marcescens readily metabolizes citrate, and the almost certain conclusion is that his E. coli acquired the adaptive trait through lateral gene transfer. This, itself, is an exciting finding, and Davis writes up and distributes a report for prepublication review by the research team. Two members object to Davis’ procedure section where he says that the E. coli culture was experimentally inoculated with S. marcescens. How should Davis respond? Choose two from the following:

  19. Item Example: P2I • A. Remind them that this is his laboratory and he alone makes the final decisions about the content of publications from his laboratory. • B. Agree that the students have a good and valid point, but that the machinery of science is intolerant of accidental findings. • C. Explain to his students that the intention behind a finding is irrelevant to the fact being presented. • D. Revise the paper’s method section to explain how S. marcescens found its way into the culture. • E. Explain that full disclosure would only taint the reputation of the laboratory by admitting the breach in original procedure. • F. Listen to the objections of his students, accept that he is outvoted, and change the procedure section of the paper. • G. Do not offer any explanation other than a reminder that solid publications are what make a career and that if they are uncomfortable with this, they should consider leaving the lab. • H. Acknowledge to the students that they are probably correct and use the data as a basis for a follow-up study on lateral gene transfer under more tightly controlled conditions.
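
How a pick-2 response might be scored is not spelled out on the slide; one plausible rule is to count how many of the two selected options match the keyed pair. The Python sketch below uses placeholder letters, not the actual answer key for this vignette.

    # One plausible scoring rule for a "pick 2" item: the score is the number of
    # keyed "optimal responses" among the two options selected (0, 1, or 2).
    # The keyed pair in the examples is a placeholder, NOT the actual answer key.
    def score_pick2(selected: set, keyed_optimal: set) -> int:
        if len(selected) != 2:
            raise ValueError("Exactly two options must be selected.")
        return len(selected & keyed_optimal)

    print(score_pick2({"D", "H"}, {"D", "H"}))  # 2
    print(score_pick2({"A", "D"}, {"D", "H"}))  # 1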

  20. Content Validity • Level 1: Delphi survey • Define objectives and determine important RCR content to be taught and knowledge assessed by consensus panel • Level 2: Item development • According to assessment recommendations from Delphi panel • Evidence-based answers from well-recognized RCR curricula and textbooks • Level 3: Expert review

  21. Expert Review • 3 experts to review RCR test items and provide revisions and comments • MCQ (all three experts to review) • P2 (one expert to review each area) • Biological Sciences • Social Science • Health Science

  22. Construct Validity • Pilot-test Items • CITI Program: • Implement test bank • Each item will be completed by a minimum of 200 CITI participants • A participant will complete only 20% of the total battery of items: 5 MCQs in each of the 7 core areas, plus 2-3 P2 vignettes with 3 items each (41-44 items total) • Total of 1,000 CITI participants needed
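
The sampling numbers on this slide are internally consistent, as the quick arithmetic check below shows (Python, using only the figures stated above).

    # Back-of-the-envelope check of the pilot sampling plan (illustrative only).
    responses_needed_per_item = 200        # minimum completions required per item
    battery_fraction_per_person = 0.20     # each participant sees ~20% of the items

    # With items spread evenly across participants, expected responses per item
    # equal participants * fraction, so:
    participants_needed = responses_needed_per_item / battery_fraction_per_person
    print(participants_needed)             # 1000.0

    # Per-participant load: 5 MCQs in each of 7 core areas, plus 2-3 vignettes
    # carrying 3 pick-2 items each.
    mcqs = 5 * 7                           # 35
    p2_min, p2_max = 2 * 3, 3 * 3          # 6 to 9
    print(mcqs + p2_min, mcqs + p2_max)    # 41 44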

  23. CITI & Construct Validation • CITI program pilot-testing • All items will be pilot-tested on the CITI training program • While participants can re-take test until they pass, only their initial response will be considered in construct validation • Will use classical test theory to validate items: • Will identify top 20% and bottom 20% of test performers • Any items missed more frequently by the top 20% than the bottom 20% will be treated as invalid. Such items will be discarded or revised

  24. CITI Logistics • CITI will upload test items using our Excel file • Program test items to appear in random subsets • After participants have completed all test items, CITI will send de-identified data for statistical analysis • Test bank team will revise items and send CITI a finalized test bank for use by Jan 2010
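
Serving each participant a random subset of the bank could be done in many ways; the Python sketch below is only a design illustration under the 20% assumption above, not CITI's actual implementation.

    # Illustrative assignment of a random ~20% subset of items to one participant.
    import random

    def assign_subset(item_ids, fraction=0.20, seed=None):
        """Return a random subset of item IDs for one participant."""
        rng = random.Random(seed)
        k = max(1, round(len(item_ids) * fraction))
        return rng.sample(list(item_ids), k)

    bank = [f"MCQ-{i:03d}" for i in range(1, 126)]   # placeholder item IDs
    print(assign_subset(bank, seed=42)[:5])          # first 5 of a 25-item subset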

  25. CITI—Ethical Issues • Participants can re-take the test until passing score achieved • Most items tested are covered by CITI educational content • A few test items that are not directly covered in the CITI training program will also be piloted • These will not be scored, but feedback will be provided • This will ensure fairness and will provide educational value

  26. CITI—Ethical Issues • Will obtain IRB permission • CITI will provide participants with introductory statement and request for permission to use test data • Our approach distinguishes between: • Consent to take the test (not needed, because testing is a required component of training and items are already content-validated) • Consent to use data in any activities that extend beyond normal QI (will be requested)

  27. Introductory Statement for Participants “Most of the test items you will see address material directly covered in our training program. However, some items also test your ability to generalize to new ethical situations, and still others test your ethical problem-solving skills. In each case, you will be told the correct answer after you complete an item and will have the chance to re-take the test until you achieve a passing score.”

  28. Upon Completion of the Test “CITI is committed to quality improvement and sometimes analyzes de-identified test and survey data to identify ways to improve test items and course content. We may also describe our quality improvement efforts in publications or at conferences using analyses of de-identified test data. Do we have your permission to use your de-identified test data in this manner?” Yes / No
