
Validity and Validation Methods


Presentation Transcript


  1. Validity and Validation Methods

  2. Workshop Flow
  • The construct of MKT
    • Gain familiarity with the construct of MKT
    • Examine available MKT instruments in the field
  • Assessment Design
    • Gain familiarity with the Evidence-Centered Design approach
    • Begin to design a framework for your own assessment
  • Assessment Development
    • Begin to create your own assessment items in line with your framework
  • Assessment Validation
    • Learn basic tools for how to refine and validate an assessment
    • Plan next steps for using assessments

  3. Assessment Development Process (diagram): Domain Analysis → Domain Modeling (Design Pattern) → Define Item Template (Define Test Specs) → Define Item Specs → Develop Pool of Items → Refine Items → Collect/Analyze Validity Data → Refine Items → Assemble Test → Document Technical Info

  4. Validity: The Cardinal Virtue of Assessment
  • "The degree to which empirical evidence and theoretical rationales support the adequacy and appropriateness of inferences and actions based on test scores or other modes of assessment." -- Mislevy, Steinberg, and Almond, 2003
  • "Validation is a process of accumulating evidence to provide a scientifically sound validity argument to support the intended interpretation of test scores." -- Standards for Educational and Psychological Testing (AERA/APA/NCME, 1999)
  • Jargon note: two kinds of "evidence"

  5. Assessment Reliability: the extent to which an instrument yields consistent, stable, and uniform results over repeated administrations under the same conditions each time. (Figure obtained from http://www.socialresearchmethods.net/KB/rel&val.htm)
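To make the definition concrete, here is a minimal sketch (not from the workshop materials; the scores and sample are invented) that estimates test-retest reliability as the correlation between two administrations of the same instrument to the same teachers:

```python
# Hypothetical illustration: test-retest reliability as the Pearson
# correlation between two administrations. Scores are invented.
import numpy as np

time1 = np.array([12, 18, 9, 22, 15, 20, 11, 17])   # first administration
time2 = np.array([13, 17, 10, 21, 14, 19, 12, 18])  # same teachers, re-tested later

r = np.corrcoef(time1, time2)[0, 1]
print(f"Test-retest reliability estimate: r = {r:.2f}")
```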

  6. Steps of Item Validation: Iterative Refinement (diagram)

  7. 1. Expert Panel Review (Formative)
  • Are the items aligned with…
    • The test specifications?
    • Content covered in the curriculum?
    • State or national standards?
  • Is the complexity level aligned with the intended use (e.g., target population, grade level)?
  • Are the item's prompts and rubrics aligned?

  8. 2. Feasibility of Items (Think-Alouds)
  • Does the item make sense to the teacher?
  • Does the item elicit the cognitive processes intended?
  • Can the item be completed in the available time?
  • Can respondents use the diagrams, charts, and tables as intended?
  • Is the language clear?
  • Are there differences in approaches between experts and novices (or between teachers exposed or not exposed to the relevant instruction)?

  9. SimCalc Example: Think-Alouds (Proportional Reasoning Problem #3)
  Expected proportional reasoning: 3.5 white / 3 dark = x white / 5 dark
  Found: Just draw the bars!
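(For reference, the expected setup solves to x = 5 × 3.5 / 3 ≈ 5.8 white; the finding was that respondents bypassed the ratio setup entirely and reached an answer by drawing the bars directly.)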

  10. Conducting Think-Alouds
  • Sample
    • N: you learn the most from the first 3-6 participants
  • Who
    • Experts and novices
    • Low, medium, and high achievers
    • Respondents varying in English proficiency
  • Data capture and analysis
    • Data can be extremely rich and can be analyzed at varying levels of detail
    • Often sufficient to do real-time note-taking
    • Videotaping can be helpful
  • Document
    • Problems with item clarity (language, graphics)
    • Response processes: what strategies are they using?

  11. 3. Field Testing
  • Item-level concerns
    • Are there ceiling or floor effects?
    • What is the range of responses we can expect from a variety of teachers?
    • Is the amount of variation in responses sufficient to support statistical analysis?
    • What is the distribution of responses across distracters?
    • Do the items discriminate among teachers performing at different levels?
  • Assessment-level concerns
    • Are there biases among subgroups?
    • Does the assessment have high internal reliability?
    • What is the factor structure of the test?
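As a rough illustration of two of these concerns, the sketch below (invented 0/1 item scores, not SimCalc data) computes Cronbach's alpha as one index of internal reliability and corrected item-total correlations as one index of item discrimination:

```python
# Hypothetical sketch: internal reliability (Cronbach's alpha) and
# item discrimination (corrected item-total correlation).
import numpy as np

# rows = respondents, columns = items (1 = correct, 0 = incorrect); invented data
scores = np.array([
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 0],
    [1, 1, 1, 0, 1],
])

k = scores.shape[1]
item_var = scores.var(axis=0, ddof=1)       # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
alpha = (k / (k - 1)) * (1 - item_var.sum() / total_var)
print(f"Cronbach's alpha: {alpha:.2f}")

# Corrected item-total correlation: each item against the total of the
# remaining items, so an item is not correlated with itself.
for j in range(k):
    rest = scores.sum(axis=1) - scores[:, j]
    disc = np.corrcoef(scores[:, j], rest)[0, 1]
    print(f"Item {j + 1}: discrimination = {disc:.2f}")
```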

  12. Key Item Statistic: Percent Correct
  • What percent of people get it correct?
  • Gives us a sense of:
    • The item difficulty
    • The range of responses
  • Alerts you to potential problems:
    • Floor = roughly 0-10% correct
    • Ceiling = roughly 85-100% correct
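A minimal sketch of this statistic, applying the rough floor/ceiling thresholds from the slide to invented 0/1 scores:

```python
# Hypothetical sketch: percent correct per item, flagged against the
# rough floor (0-10%) and ceiling (85-100%) ranges above.
import numpy as np

# rows = respondents, columns = items (1 = correct, 0 = incorrect); invented data
scores = np.array([
    [1, 1, 0, 1],
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [1, 1, 0, 1],
])

pct_correct = 100 * scores.mean(axis=0)
for j, p in enumerate(pct_correct, start=1):
    flag = "floor?" if p <= 10 else "ceiling?" if p >= 85 else "ok"
    print(f"Item {j}: {p:.0f}% correct ({flag})")
```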

  13. SimCalc Example: Exploratory Results for Item #20 (figure: response distribution by quartiles of total test score)

  14. SimCalc Example: Exploratory Results for Item #43 (figure; 'Skip' appears as a response category)

  15. SimCalc Example: Exploratory Results for Item #6 (figure)
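Exploratory tables like those behind the three figures above can be produced by cross-tabulating each item's responses (including skips) against quartiles of total test score; the sketch below uses invented data, and the column names are assumptions:

```python
# Hypothetical sketch: distribution of responses to one item across
# quartiles of total test score. Data and column names are invented.
import pandas as pd

df = pd.DataFrame({
    "total_score": [10, 14, 22, 30, 12, 25, 18, 28, 9, 20, 16, 27],
    "item_20":     ["A", "B", "C", "C", "A", "C", "B", "C", "skip", "B", "A", "C"],
})

df["quartile"] = pd.qcut(df["total_score"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(pd.crosstab(df["item_20"], df["quartile"]))
```

For a well-functioning item, the correct answer should dominate in the upper quartiles while distracters and skips concentrate in the lower ones.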

  16. Conducting a Field Test
  • Test under conditions as close to "real" as possible
    • Analogous population of teachers
    • Administration conditions
    • Formatting
    • Scoring
  • Gather and use demographic data
  • Determine sample size based on:
    • The number of teachers you can get
    • The kinds of statistical analyses you decide to conduct
    • e.g., 5-10 respondents per item for fancy statistics
  • Can use simple and fancy statistics
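For example, under the 5-10 respondents-per-item rule of thumb, a hypothetical 40-item instrument would call for roughly 200-400 teachers before attempting the fancier (e.g., factor-analytic) statistics.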

  17. Field Testing with Teachers by Mail
  • Purchasing national mailing lists of teachers
    • http://www.schooldata.com/
    • http://www.qeddata.com
  • Best-practices mailing sequence (Cook et al., 2000)
    • An introductory postcard announcing that a survey will be sent
    • About a week later, a packet containing the survey
    • About two weeks later, a reminder postcard
    • About two weeks later, a second packet containing the survey and a reminder letter
    • About three weeks later, a 'third appeal' postcard

  18. Steps of Item Validation: Iterative Refinement (diagram)

  19. 4. Expert Panel Review (Summative)
  • Similar questions as in Step 1 (the formative review)
  • Same or different panel of experts
  • Ratings and alignment collected after items are fully refined
  • Results of the summative expert panel review provide evidence of alignment of items with standards/curriculum, content validity, and grade-level appropriateness
    • This could be reported in technical documentation

  20. Steps of Item Validation: Iterative Refinement (diagram)

  21. Creating a Validity Argument
  • Integrates all the evidence into a coherent account of the degree to which existing evidence and theory support the intended interpretation of test scores

  22. For a Sound Validity Argument, at Minimum, Pay Attention to…

  23. Activity #5: Conduct Think-Alouds
  Be the observer for your own items!
  • Break into groups of 3 and select roles:
    • 1 interviewer
    • 1 interviewee
    • 1 observer to complete the observation recording sheet
  • Select a set of 2 items
  • Conduct think-alouds; the interviewer and observer take notes on the form in the protocol
  • Repeat two more times, switching roles, with new items
  • Revise your own items
  • Afterward, we will have a discussion about:
    • Insights about the development of assessment items
    • Questions and challenges

  24. Activity #5: Think-Aloud Pointers
  • Find out how long problems take to do
  • Uncover issues of item clarity and level of difficulty
  • Derive a model of the knowledge and thinking that students engage in when solving each problem; in observation notes, describe:
    • How problems are solved, focusing on the underlying knowledge, skills, and structures of item performance
    • Actions, thought processes, and strategies

  25. Activity #5: Think-Aloud Pointers (continued)
  • Interviewers SHOULD:
    • Prompt the teacher to keep talking
    • Ask clarifying questions about what teachers are saying (but not as scaffolding)
  • Interviewers SHOULD NOT:
    • Help teachers in any way during the interview (e.g., no hints, tips, or scaffolding)
    • Give unintentional hints, such as being more encouraging when answers are correct

  26. Steps of Item Validation: Iterative Refinement (diagram)
