
Social Science Faculty Meeting


Presentation Transcript


  1. Mastering the Art of Test Writing Roundtable Discussion Social Science Faculty Meeting January 2010

  2. Mastering the Art of Test Writing Presenter • Jesse Coraggio, Director, Academic Effectiveness Former Life… • Director of Test Development, SMT • Director of Measurement and Test Development, Pearson • Taught EDF 4430 Measurement for Teachers, USF Academic Effectiveness and Assessment

  3. Purpose • This presentation will explain how to create effective multiple choice test questions. • The presentation will provide item-writing guidelines as well as best practices to prevent students from just guessing the correct answers. Academic Effectiveness and Assessment

  4. Objectives • Purpose of a Test • Advantages of Objective Tests • Types of Objective Tests • Writing Multiple Choice Items • The Test-wise Student • Test Instructions • Test Validity Academic Effectiveness and Assessment

  5. Purpose of a Test • “Clearly delineate between those that know the content and those that do not.” • The purpose of an assessment is to determine whether the student knows the content, not whether the student is a good test-taker. • Likewise, confusing and tricky questions should be avoided to prevent incorrect responses from students who know the material. Academic Effectiveness and Assessment

  6. Objective Tests • Measure several types of learning (also levels) • Wide content, short period of time • Variations for flexibility • Easy to administer, score, and analyze • Scored more reliably and quickly • What type of learning cannot be measured? Academic Effectiveness and Assessment

  7. Types of Objective Tests • Written-response • Completion (fill-in-the-blank) • Short answer • Selected-response • Alternative response (two options) • Matching • Keyed (like matching) • Multiple choice Academic Effectiveness and Assessment

  8. Written-response • Single questions/statements or clusters (stimuli) • Advantages • Measure several types of learning • Minimizes guessing • Points out student misconceptions • Disadvantages • Time to score • Lack of objectivity • Misspelling and writing clarity • Incomplete answers • More than one possible correct response (novel answers) • Subjectivity in grading Academic Effectiveness and Assessment

  9. Completion A word that describes a person, place, or thing is a ________. • Remove only key words • Blanks at end of statement • Avoid multiple correct answers • Eliminate clues • Paraphrase statements • Use answer sheets to simplify scoring Academic Effectiveness and Assessment

  10. Short Answer Briefly describe the term proper noun. ____________________________ • Terminology – Stimulus and Response • Provide an appropriate blank (word(s) or sentence). • Specify the units (inches, dollars) • Ensure directions for clusters of items are appropriate for all items Academic Effectiveness and Assessment

  11. Selected-response Select from provided responses • Advantages • Measure several types of learning • Measures ability to make fine distinctions • Administered quickly • Cover wide range of material • Reliably scored • Multiple scoring options (hand, computer, scanner) • Disadvantages • Allows guessing • Distractors can be difficult to create • Student misconceptions not revealed Academic Effectiveness and Assessment
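
The "allows guessing" disadvantage above can be made concrete with simple arithmetic: under blind guessing, the expected score is the number of items divided by the number of options per item. A minimal sketch in Python; the item counts below are made-up illustration values, not figures from the presentation:

```python
# Expected score from blind guessing on selected-response items.
# Illustration only; the item counts are made-up values.

def expected_guess_score(num_items: int, options_per_item: int) -> float:
    """Expected number of items answered correctly by pure guessing."""
    return num_items / options_per_item

# A 50-item true/false (two-option) test: 25 items correct expected by chance.
print(expected_guess_score(50, 2))   # 25.0

# A 50-item four-option multiple choice test: 12.5 items by chance.
print(expected_guess_score(50, 4))   # 12.5
```

This is one reason alternative-response (two-option) items reward guessing far more than four-option multiple choice items.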

  12. Alternative Response T F 1. A noun is a person, place, or thing. T F 2. An adverb describes a noun. • Explain judgments to be made • Ensure answer choices match • Explain how to answer • Only one idea to be judged • Positive wording • Avoid trickiness, clues, qualifiers Academic Effectiveness and Assessment

  13. Matching Item Column A: __ Person, place, or thing. __ Describes a person, place, or thing. Column B: a. Adjective b. Noun • Terminology – premises and responses • Clear instructions • Homogeneous premises • Homogeneous responses (brief and ordered) • Avoid one-to-one matching between premises and responses Academic Effectiveness and Assessment

  14. Keyed Response Responses: a. A noun b. A pronoun c. An adjective d. An adverb ___ Person, place, or thing. ___ Describes a person, place, or thing. • Like matching items, but with more response options Academic Effectiveness and Assessment

  15. MC Item Format What is the part of speech that is used to name a person, place, or thing? A) A noun* B) A pronoun C) An adjective D) An adverb Academic Effectiveness and Assessment

  16. MC Item Terminology • Stem: sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response. • Options: the possible responses, including one and only one correct answer. • Key: the correct response. • Distractor: an incorrect but plausible response that is attractive to an under-prepared student. Academic Effectiveness and Assessment
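
The terminology on this slide maps naturally onto a small data structure. A minimal sketch, assuming nothing beyond the slide's definitions; the class and field names are my own, and the check simply enforces the "one and only one correct answer" rule:

```python
from dataclasses import dataclass

@dataclass
class MCItem:
    stem: str                 # question or incomplete thought; carries all needed information
    options: dict[str, str]   # option letter -> option text (key plus distractors)
    key: str                  # letter of the one correct response

    def __post_init__(self):
        # "One and only one correct answer": the key must be a single existing option letter.
        if self.key not in self.options:
            raise ValueError("key must be one of the option letters")

    @property
    def distractors(self) -> dict[str, str]:
        # Plausible but incorrect responses.
        return {k: v for k, v in self.options.items() if k != self.key}

# The slide-15 example expressed in this structure.
item = MCItem(
    stem="What is the part of speech that is used to name a person, place, or thing?",
    options={"A": "A noun", "B": "A pronoun", "C": "An adjective", "D": "An adverb"},
    key="A",
)
```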

  17. Competency • Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students. • Assessing lower-division students on graduate-level material is an ‘unfair’ expectation. • The competent student should do well on an assessment; items should not be written only for the top students in the class. Academic Effectiveness and Assessment

  18. Clarity • Clear, precise items and instructions • Correct grammar, punctuation, spelling • Address a single issue • Avoid extraneous material (teaching) • One correct or clearly best answer • Legible copies of the exam Academic Effectiveness and Assessment

  19. Bias • Tests should be free from bias… • No stereotyping • No gender bias • No racial bias • No cultural bias • No religious bias • No political bias Academic Effectiveness and Assessment

  20. Level of Difficulty • Ideally, a test should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (e.g., workforce areas). Academic Effectiveness and Assessment

  21. Level of Difficulty • To make a M/C item more difficult, make the stem more specific or narrow and the options more similar. • To make a M/C item less difficult, make the stem more general and the options more varied. Academic Effectiveness and Assessment
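
One common way to check difficulty after a test has been given (standard practice in test development, though not stated on the slide) is the item difficulty index: the proportion of students who answer the item correctly, with mid-range values indicating a middle level of difficulty. A minimal sketch; the student responses below are made-up illustration data:

```python
# Item difficulty index (p-value): proportion of students answering correctly.
# The responses are made-up illustration data.

def difficulty_index(responses: list[str], key: str) -> float:
    return sum(r == key for r in responses) / len(responses)

responses = ["A", "A", "C", "A", "B", "A", "A", "D", "A", "C"]  # 10 students
p = difficulty_index(responses, key="A")
print(f"p = {p:.2f}")  # p = 0.60 -> roughly middle difficulty
```

A very high p-value suggests an item is too easy (or cued); a very low value suggests it is too hard, miskeyed, or confusing.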

  22. Trivial and Trick Questions • Avoid trivia and tricks • Avoid humorous or ludicrous responses • Items should be straightforward; they should cleanly delineate those who know the material from those who do not • Make sure every item has value and contributes to the final score Academic Effectiveness and Assessment

  23. http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Test Taking Guidelines When you don’t know the answer • As with all exams, attempt the questions that are easiest for you first. Come back and do the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank. Make a calculated guess if you are sure you don’t know the answer. Here are some tips to help you guess ‘intelligently’. Use a process of elimination • Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are options in the right range? Is the measurement unit correct? Does it sound reasonable? Academic Effectiveness and Assessment

  24. http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Test Taking Guidelines Look for grammatical inconsistencies • In extension-type questions a choice is nearly always wrong if the question and the answer do not combine to make a grammatically correct sentence. Also look for repetition of key words from the question in the responses. If words are repeated, the option is worth considering. e.g.: • The apparent distance hypothesis explains… • b) The distance between the two parallel lines appears… Be wary of options containing definitive words and generalizations • Because they can’t tolerate exceptions, options containing words like ‘always’, ‘only’, ‘never’, ‘must’ tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often. Academic Effectiveness and Assessment

  25. http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Test Taking Guidelines Be wary of options containing definitive words and generalizations • Because they can’t tolerate exceptions, options containing words like ‘always’, ‘only’, ‘never’, ‘must’ tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often. Favor look-alike options • If two of the alternatives are similar, give them your consideration. e.g.: A. tourism consultants B. tourists C. tourism promoters D. fairy penguins Academic Effectiveness and Assessment

  26. http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf Test Taking Guidelines Favor numbers in the mid-range • If you have no idea what the real answer is, avoid extremes. Favor more inclusive options • If in doubt, select the option that encompasses others. e.g.: A. an adaptive system B. a closed system C. an open system D. a controlled and responsive system E. an open and adaptive system. Please note: None of these strategies is foolproof and they do not apply equally to the different types of multiple choice questions, but they are worth considering when you would otherwise leave a blank. Academic Effectiveness and Assessment

  27. Test-wise Students • Are familiar with item formats • Use informed and educated guessing • Avoid common mistakes • Have testing experience • Use time effectively • Apply various strategies to solve different problem types Academic Effectiveness and Assessment

  28. Test-wise Students • Vary your keys: counter the “always pick option ‘C’” strategy • Avoid ‘all of the above’ and ‘none of the above’ • Avoid extraneous information: it may assist in answering another item • Avoid item ‘bad pairs’ or ‘enemies’ • Avoid clueing with the same word in the stem and the key Academic Effectiveness and Assessment
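
To make "vary your keys" concrete: one simple safeguard is to tally where the correct answers fall and flag a lopsided distribution before the exam is finalized. A minimal sketch; the answer key below is a made-up example, and the threshold is an arbitrary rule of thumb, not a standard from the presentation:

```python
from collections import Counter

# Made-up answer key for a 12-item, four-option multiple choice test.
keys = ["C", "C", "B", "C", "C", "A", "C", "C", "D", "C", "C", "B"]

counts = Counter(keys)
print(counts)  # Counter({'C': 8, 'B': 2, 'A': 1, 'D': 1})

# Flag any option letter keyed far more often than an even split would suggest.
expected = len(keys) / 4
for letter, n in counts.items():
    if n > 2 * expected:
        print(f"Option {letter} is keyed {n} times; consider re-keying some items.")
```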

  29. Test-wise Students • Make options similar in terms of length, grammar, and sentence structure. Different options stand out. Avoid ‘clues’. Academic Effectiveness and Assessment

  30. Item Format Considerations • Put the information in the stem • Avoid negatively stated stems and qualifiers • Highlight qualifiers if used • Avoid irrelevant symbols (“&”) and jargon • Use a standard, set number of options (four is preferred) • Ideally, tie each item to a reference Academic Effectiveness and Assessment

  31. Test Directions Highlight directions: • State the skill measured. • Describe any resource materials required. • Describe how students are to respond. • Describe any special conditions. Academic Effectiveness and Assessment

  32. Ensure Test Validity • Congruence between items and course objectives • Congruence between items and student characteristics • Clarity of items • Accuracy of the measures • Item formatting criteria • Feasibility (time, resources) Academic Effectiveness and Assessment
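
The first bullet, congruence between items and course objectives, can be spot-checked with a simple blueprint: map each item to the objective it measures and list any objectives left without items. A minimal sketch; the objective names and item numbers are made-up placeholders, not content from the presentation:

```python
# Hypothetical test blueprint: item number -> course objective it measures.
blueprint = {
    1: "Identify parts of speech",
    2: "Identify parts of speech",
    3: "Apply subject-verb agreement",
    4: "Apply subject-verb agreement",
}

course_objectives = {
    "Identify parts of speech",
    "Apply subject-verb agreement",
    "Use punctuation correctly",
}

covered = set(blueprint.values())
uncovered = course_objectives - covered
print("Objectives with no items:", uncovered)
# Objectives with no items: {'Use punctuation correctly'}
```

The same mapping also shows whether any single objective is carrying a disproportionate share of the items.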

  33. Questions Academic Effectiveness and Assessment

  34. Social Science Faculty Meeting Mastering the Art of Test Writing Roundtable Discussion January 2010
