
Best Practices for Building Tests and Quizzes




  1. Best Practices for Building Tests and Quizzes League for Innovation November 2004

  2. 7 question types • multiple choice • true/false • essay/short answer • matching • ordering • fill in the blank • multiple answers

  3. Part II - Instructional Design Considerations for Creating Tests

  4. Selected Response Questions • Selected response questions require students to select from a predetermined list of potential answers. • multiple choice • true/false • matching • fill-in-the-blank questions

  5. Created Response Questions • Ex. Essays measure the student’s ability to communicate effectively, not just their understanding of content. • Easier to write, but harder (and more subjective) to grade.

  6. Match Question Type to Level of Assessment Desired • Multiple choice and matching questions offer the most flexibility in the content covered and the thinking skills that can be assessed • True/false questions are usually limited to fact recall.

  7. Match Instructional Objectives • Use lesson plans or teacher notes to accurately reflect content that was covered in class • Choose the most important objectives to assess and use these as the outline for your test

  8. Cover Important Material • Facts, definitions, comprehension, analysis, applications • Trivial items result in trivial studying and learning.

  9. Items Should be Independent • Do not “give away” answers via information in other questions • Independence maximizes breadth of coverage

  10. Write Simply and Clearly • Measure knowledge of material and concepts, not vocabulary • Ambiguous questions create error, frustration, and compound biases related to language and disability

  11. Clearly Specify What Type of Response is Sought • How long or short should the answer be? • Should students show their work? • Do you want description? Comparison? Application? Evaluation? • Whose opinion do you want (the book’s, the lecture’s, their own)?

  12. Good Tests Take Time to Write • Give yourself enough time to evaluate items after a day or two • Revise, edit, and ask others to read before administering the test

  13. Good Tests Have High Degrees of Reliability and Validity • Reliability refers to the extent to which measurement is consistent • Validity refers to the extent to which a test measures what it should.
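Reliability as consistency can be made concrete with a small calculation. The sketch below is not part of the original slides and the scores are invented; it estimates test-retest reliability as the Pearson correlation between two administrations of the same test to the same students:

```python
# Illustrative sketch: test-retest reliability as the Pearson
# correlation between scores from two administrations of the
# same test. The scores are made-up example data.
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

first_try  = [72, 85, 90, 65, 78]   # hypothetical scores, week 1
second_try = [70, 88, 92, 60, 80]   # same students, week 2

reliability = pearson_r(first_try, second_try)
print(round(reliability, 3))
```

A coefficient near 1.0 suggests the test measures consistently; a much lower value would suggest the test, or its scoring, is unstable. Validity still has to be judged separately, by comparing the items against the objectives they are meant to cover.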

  14. Poor Example

  15. Construct Questions Carefully • The most important step in test construction is choosing the wording of each question and answer. • Question stem and distractors

  16. Construction Tips • Construct evaluation items with a single correct answer • Use plausible distractors • Arrange options in a logical sequence • Include a minimum of 3 and a maximum of 5 options • Randomly vary the position of the correct response among the options
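Randomizing where the correct response appears is easy to automate. A minimal sketch, assuming a question is stored simply as a correct answer plus a list of distractors (names and data are hypothetical, not from the slides):

```python
# Illustrative sketch: shuffle answer options so the correct
# response lands in a random slot, while tracking its position.
import random

def shuffle_options(correct, distractors, rng=random):
    """Return the shuffled options and the index of the correct answer."""
    options = [correct] + list(distractors)
    rng.shuffle(options)
    return options, options.index(correct)

random.seed(7)  # fixed seed only so the example is repeatable
options, answer_index = shuffle_options("1962", ["1960", "1964", "1966"])
print(options, answer_index)
```

Shuffling per student (or per test form) also defeats answer-pattern cues such as overusing one letter for the correct choice.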

  17. Construction Tips • Refrain from using the choices “all of the above” or “none of the above” • Identify an initial difficulty level

  18. Construction Tips • Item difficulty can be increased by paying attention to the distractors (options) • The more “homogeneous” the options and distractors, the higher the degree of difficulty

  19. Higher Degree of Difficulty The first James Bond movie was released in what year? a) 1960 b) 1962 c) 1964 d) 1966

  20. Mid-Level Difficulty The first James Bond movie was released in what year? a) 1960 b) 1962 c) 1964 d) 1970

  21. Low-Level Difficulty The first James Bond movie was released in what year? a) 1958 b) 1962 c) 1970 d) 1975

  22. Points of Concern for Different Formats • Writing True-False Items: • Choose words with precise, definite meanings • Avoid tricks and trivia • Avoid easy clues • Follow guidelines for multiple choice

  23. Writing Multiple Choice Questions • Present the problem, including any qualifying statement • There should be only one correct answer • Distractors should be plausible but clearly incorrect • Avoid negative wording (especially double negatives)

  24. Writing Multiple Choice Questions • When item is controversial, indicate whose opinion is sought • Avoid irrelevant cues to correct answer (length, grammar) • Items should test one central idea or concept

  25. Writing Multiple Choice Questions • Watch out for patterns in alternatives (overuse of "C" as the correct answer) • Choose an appropriate level of difficulty • Ensure even coverage of material and types of knowledge

  26. Writing Matching Items • Keep each matching set short (no more than 5 stimuli/responses) • Use longer items as stimuli, shorter as responses • Arrange responses in alphabetical or logical order

  27. Writing Short Answer and Completion Items • Clearly indicate type of answer you want • Do not use more than two blanks per completion item • Make a key before scoring • Periodically re-score early tests to detect shifting criteria

  28. Writing Essay Questions • Limit questions to vitally important material • Clearly define task, scope, and directions for a "good" answer • Allow time for thought • Use multiple medium-length essays rather than one long one

  29. Writing Essay Questions • Use questions that have a limited number of good answers • Allow choice between alternatives (e.g., "answer 3 of the 4 questions")

  30. Part III - Evaluating Your Assessment, or Assessing Your Evaluation

  31. Evaluating the Test • Once scored, spend some time reviewing the test and observing patterns that may be present. • Were there any questions that every student got wrong? • If so, can you deduce whether it was due to poor test item construction or to instruction?

  32. Evaluating the Test • If an item was poorly written, consider canceling it out and rescoring the test. • If students didn’t understand the concept, perhaps the content needs to be retaught.
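The review described on the last two slides is a simple item analysis. A sketch with invented response data: it computes each item's difficulty index (the proportion of students who answered it correctly) and flags items that nearly everyone missed, which are the first candidates for the "poor construction or poor instruction?" question:

```python
# Illustrative sketch: basic item analysis on scored responses.
# Each row is one student's results (True = answered correctly).
responses = [  # hypothetical results: 4 students, 5 items
    [True,  True,  False, True,  False],
    [True,  False, False, True,  False],
    [False, True,  False, True,  True ],
    [True,  True,  False, False, False],
]

def difficulty_indices(rows):
    """Proportion of students answering each item correctly."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

indices = difficulty_indices(responses)
flagged = [i + 1 for i, p in enumerate(indices) if p < 0.25]
print(indices)   # item 3 was missed by every student in this data
print(flagged)   # items to review before deciding to cancel or reteach
```

The 0.25 cutoff here is an arbitrary example threshold; the point is only that very low indices deserve a closer look at the item's wording before the score stands.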

  33. A Checklist for Teachers • Are your directions CLEAR and CONCISE? • Are your objectives clear? • Is there a logical connection between questions and answers? • Are you being specific in regard to the area you are covering? • Are your answers simple and factual?

  34. Higher Order Thinking • What later events best affirm the ideas set forth in the Declaration of Independence? A) Emancipation Proclamation; 19th Amendment B) Eminent Domain; Manifest Destiny C) Civil War

