
Writing Better Test Questions



Presentation Transcript


  1. Writing Better Test Questions Live Meeting Presentation, <Month> <Date>, <Year>

  2. Introduction • Welcome • Live Meeting rules & tools • Introducing ourselves

  3. Unit Overview • Lesson 1 – Testing Basics: Live Meeting #1 (today); Reading Assignment (this week) • Lesson 2 – Creating Draft Items: Live Meeting #2 (next week); Homework (next week); Live Meeting #3 (in two weeks); Group Assignment (due in three weeks); Live Meeting #4 (in three weeks)

  4. Lesson Overview • Today’s Objectives • Today’s Agenda: Discuss pre-work; Define the scope of this class; Introduce key concepts in testing • This Week’s Assignment: Read chapters 1-3 in Measuring Instructional Results and complete a short quiz in the ELC

  5. What is Testing? • Testing: Collecting numerical data about test takers • Measurement: Using data to see if something is present • Assessment: Systematically gathering data without making judgments • Evaluation: Judging the appropriateness of a person or program for a specific purpose

  6. Pre-work Discussion • What were the facts of the case? • What key information was missing from the report as described in the assignment and the discussion posts that you read? • Were the people posting comments emotional about the issue? Why is that?

  7. Matching Definitions: Testing, Measurement, Assessment, Evaluation • The average customer service rep has 14.8 years of education • She proofread the documents, removing all errors • The program attendees said instruction in spelling was relevant • Our written letters contain an average of 1.5 spelling errors

  8. Kirkpatrick Levels of Evaluation • Level 1 – Reaction: Participants react to training • Level 2 – Learning: Changes in knowledge or skill • Level 3 – Behavior: Changes in on-the-job performance • Level 4 – Results: Business impact of training program


  10. Reliability • The degree to which the test provides consistent results: Equivalence reliability (two forms, same result); Test-retest reliability (same test, same student, same result); Inter-rater reliability (two judges render the same decision) • Name some instances when we need each of these at Liberty

  11. Validity • The degree to which the test measures what it’s supposed to: Face validity (test seems reasonable); Content validity (experts affirm test covers key content); Concurrent validity (test shown to distinguish current masters/non-masters); Predictive validity (test predicts future masters)

  12. Validity • As technical training developers, we care most about developing tests that measure the correct material: Content validity (experts affirm test covers key content) • “An invalid test is not worth anything, to anybody, at any time, for any purpose.” – Sharon Shrock and William Coscarelli, in Criterion-Referenced Test Development (2007)

  13. Types of Tests • Norm-referenced: Compare test-takers to each other. (Valid) norm-referenced tests are useful when selecting from large numbers of test-takers. Great rigor and analysis are used to substantiate their validity/reliability. If you believe you have a requirement for norm-referenced testing, contact Dr. Meredith Vey of HRD in Boston.

  14. Types of Tests • Criterion-referenced: Measure whether test-takers have mastered a particular skill. Can’t be used to compare students to one another. Potential applications: • Prerequisite tests • Diagnostic tests • Posttests • Equivalency tests

  15. Methods of Test Construction • Topic-based: Make an informed guess at what looks important in a given set of content. Never a good idea. • Statistically-based: Items are selected based on responses from large numbers of respondents; used to create norm-referenced tests. • Objectives-based: The heart of criterion-referenced test development.

  16. Criterion-Referenced Test Development Process • This unit focuses on writing better test items. Item-writing is only one step in creating useful tests. • A systematic approach is needed to verify that we are testing the important skills in a way that can be verified as accurate and repeatable.

  17. Criterion-Referenced Test Development Process The CRTD process (Shrock and Coscarelli, 2007) • Analyze Job Content • Establish Content Validity of Objectives • Create Items • Establish Content Validity of Items and Instruments • Conduct Initial Test Pilot • Perform Item Analysis • Create Parallel Forms and Item Banks • Establish Cut-Off Scores • Determine Reliability • Report Scores
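
The presentation lists "Perform Item Analysis" as a CRTD step without detail. As a hedged sketch of what that step commonly involves, two classic statistics are item difficulty (the proportion answering correctly) and an upper-lower discrimination index (how much better high scorers did on the item than low scorers). The 0/1 response data and total scores below are invented sample data, not taken from the presentation.

```python
def difficulty(item_responses):
    """Item difficulty (p-value): proportion of test-takers who answered
    this item correctly (1 = correct, 0 = incorrect). Extremely high or
    low values flag items worth reviewing."""
    return sum(item_responses) / len(item_responses)

def discrimination(item_responses, total_scores):
    """Upper-lower discrimination index: difference between the item's
    p-value in the top-scoring half and the bottom-scoring half of
    test-takers. Values near 0 (or negative) flag items that fail to
    separate masters from non-masters."""
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    half = len(ranked) // 2
    low_group, high_group = ranked[:half], ranked[-half:]
    p_high = sum(item_responses[i] for i in high_group) / half
    p_low = sum(item_responses[i] for i in low_group) / half
    return p_high - p_low

# One item's responses for six test-takers, plus their total test scores.
item = [1, 1, 1, 0, 0, 1]
totals = [9, 8, 7, 3, 2, 6]
print(round(difficulty(item), 2))              # 0.67
print(round(discrimination(item, totals), 2))  # 0.67
```

Note that for criterion-referenced tests, Shrock and Coscarelli caution that norm-referenced item statistics must be interpreted differently: a well-taught mastery item can legitimately show low discrimination because nearly everyone answers it correctly.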
