Classroom Assessment: A Practical Guide for Educators, by Craig A. Mertler
Chapter 8: Objective Test Items

Presentation Transcript


  1. Classroom Assessment: A Practical Guide for Educators, by Craig A. Mertler. Chapter 8: Objective Test Items

  2. Introduction • Traditional assessment techniques—objective and subjective test items—have been used in classrooms for years. • There is a tendency for educators to believe that their development is simple and straightforward. • “…developing a test is easy; developing a good test requires knowledge, skill, [and] time…” (Gallagher, 1998)

  3. General Characteristics of Objective Test Items • Objective test items: those with a single correct response; regardless of who scores a set of responses, an identical score will be obtained. • Objective implies that subjective judgments of the scorer do not influence an individual’s score. • Also known as “selected-response” and “structured-response” items. • Include multiple-choice, matching, and alternate-choice items. • Typically assess lower-level skills such as knowledge, comprehension, and application (higher-order items are much more difficult to write).

  4. General Characteristics of Objective Test Items • Objective test items (continued) • Are relatively easy to administer, score, and analyze. • Writing high-quality items does, however, require substantial time. • Although subjectivity is removed from the scoring process, a substantial degree of subjectivity exists in determining the content to be covered by the items. • Guessing is a distinct possibility. • Poor readers may be unjustly penalized.

  5. General Characteristics of Objective Test Items • General guidelines for writing objective test items Writing Objective Items: General Guidelines 1. Objective test items should cover important content and skills. 2. The reading level and vocabulary of each item should be as elementary as possible. 3. Each objective item should be stated in an unambiguous manner, and confusing sentence structure and wording should be avoided. 4. Objective items should not consist of verbatim statements or phrases lifted from the text. 5. Clues to the correct answer should not be provided.

  6. General Characteristics of Objective Test Items • General guidelines for formatting objective tests Formatting Objective Tests: General Guidelines 1. Vary the types of items that appear on classroom tests. 2. Group items similar in format together so that each type appears in a separate section. 3. Each section should be preceded by clear directions. 4. Within each section, order the items from easiest to most difficult. 5. Although all item types will not appear on every test, they should be arranged in the following order: true-false, matching, short answer, multiple-choice, and essay. 6. Provide adequate space for students to respond to each item. 7. Avoid splitting an item between two pages.

  7. General Characteristics of Objective Test Items • General guidelines for writing objective test items • Begin test development with a table of specifications. • The chart shows the relationship between objectives, content, and Bloom’s taxonomy. • Rows specify major categories of content. • Columns represent the six levels of Bloom’s cognitive domain. • Cells are filled in with the number or percentage of items to be developed for that content and at that level. • A sketch of such a table appears below.
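To make the chart concrete, here is a minimal Python sketch of a table of specifications. The content categories and item counts are hypothetical illustrations, not taken from Mertler.

```python
# Minimal sketch of a table of specifications: rows are content categories,
# columns are the six levels of Bloom's cognitive domain, and each cell
# holds the number of items planned for that content at that level.
# Categories and counts below are hypothetical.

BLOOM_LEVELS = ["Knowledge", "Comprehension", "Application",
                "Analysis", "Synthesis", "Evaluation"]

table_of_specs = {
    "Cell structure": [4, 3, 2, 1, 0, 0],
    "Photosynthesis": [3, 3, 3, 1, 0, 0],
    "Cell division":  [3, 2, 2, 1, 0, 0],
}

# Check that the plan adds up to the intended test length (28 items here).
total = sum(sum(row) for row in table_of_specs.values())
print(f"Total items planned: {total}")
```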

  8. General Characteristics of Objective Test Items • General guidelines for writing objective test items (continued) • Provide assurance that tests cover a representative and accurate sample of the content. • Option of using published objective tests. • Consider a basic question: Do the items match the instruction provided to students? • Textbook must be carefully examined for alignment. • Scoring is relatively easy. • Use an answer key to arrive at the total number of items the student answered correctly.
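Because each objective item has a single keyed answer, scoring reduces to comparing responses against the key. A minimal Python sketch, with hypothetical item numbers, keyed answers, and student responses:

```python
# Minimal sketch: scoring a set of objective responses against an answer key.
# Item numbers, keyed answers, and the student's responses are hypothetical.

answer_key = {"1": "B", "2": "D", "3": "A", "4": "C", "5": "B"}

def score_test(responses):
    """Return the total number of items answered correctly."""
    return sum(1 for item, key in answer_key.items()
               if responses.get(item) == key)

student = {"1": "B", "2": "D", "3": "C", "4": "C", "5": "A"}
print(score_test(student))  # items 1, 2, and 4 match the key -> 3
```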

  9. Types of Objective Test Items • Multiple-Choice Items • Basic format consists of a stem and responses (one of which is correct; others are called distractors). • Stem may be written as either a question or an incomplete statement. • Three to five options. • Useful for assessing recall of facts and application of knowledge. • Can be used to assess higher-order thinking skills (much more difficult to write).

  10. Types of Objective Test Items • Multiple-Choice Items (continued) • Guidelines for development: Creating Multiple-Choice Items: General Guidelines 1. The stem should clearly present the problem to be addressed by the student. 2. All response options should be parallel in type of content. 3. Options should avoid the use of repetitive words. 4. Adjectives or adverbs that substantially alter the meaning of a stem or option should be emphasized. 5. All distractors in the response set should be plausible. 6. The grammar in each option should be consistent with the stem. 7. Items should avoid the inclusion of “all of the above,” “none of the above,” or any equivalents as response options. 8. The use of absolute terms should be avoided. 9. Items should remain independent of one another. 10. Avoid nongrammatical clues such as a key term appearing in both the stem and the correct response.

  11. Types of Objective Test Items • Multiple-Choice Items (continued) • Advantages • Allow a test to comprehensively and efficiently sample the content domain. • Can be used in virtually all subject areas. • Can be scored relatively quickly. • Scoring is an objective process. • Can provide diagnostic information. • Limitations • Susceptible to guessing. • Can be quite time consuming to construct.

  12. Types of Objective Test Items • Multiple-Choice Items (continued) • Variations • Correct-answer and best-answer variations. • Combination of multiple-choice item and short-answer essay. • Metacognitive multiple-choice item (“Explain your response…”). • Provides opportunities for students to reflect on and explain their thinking.

  13. Types of Objective Test Items • Matching Items • Basic format consists of two lists (stimuli and responses). • Should be a one-to-one correspondence between members of the two lists. • Considered to be a special case of multiple-choice item. • Can assess knowledge and comprehension skills. • Especially useful in measuring understanding of concepts or terms that are interrelated.

  14. Types of Objective Test Items • Matching Items (continued) • Guidelines for development: Creating Matching Items: General Guidelines 1. The lists should be homogeneous. 2. The directions (i.e., the basis for matching) must be made clear. 3. Avoid “perfect matching” by placing more items in the response list than in the stimulus list. 4. Use relatively short lists of stimuli and responses. 5. Place longer phrases in the stimulus list and shorter ones in the response list. 6. Arrange the lists in some logical order.

  15. Types of Objective Test Items • Matching Items (continued) • Advantages • Permit efficient assessment of related facts, ideas, and concepts. • Relatively easy to construct. • Basically a combination of multiple-choice items using the same set of responses. • Scoring is relatively easy. • Limitations • Require large amount of related concepts or ideas. • Very difficult to design for higher-order skills.

  16. Types of Objective Test Items • Alternate-Choice Items • Essentially a special case of multiple-choice items where options are limited to only two choices. • Most popular type is the true-false item. • Variations might include “correct–not correct,” “yes–no,” and “fact–opinion.” • Can be effective if written carefully (despite negative press). • Tend to overestimate student achievement since students have a 50% chance of guessing the correct answer.

  17. Types of Objective Test Items • Alternate-Choice Items (continued) • Guidelines for development: Creating Alternate-Choice Items: General Guidelines 1. Avoid the use of absolute terms and other specific determiners. 2. Avoid testing trivial knowledge. 3. Items should be stated positively; if a negative must be used, underline, bold, or italicize the term. 4. Roughly half of the test items should be keyed true and half false. 5. True statements and false statements should be of equal length. 6. True-false items should be entirely true or entirely false.

  18. Types of Objective Test Items • Alternate-Choice Items (continued) • Advantages • Relatively quick to construct, answer, and score. • Can be scored efficiently and objectively. • Limitations • Highly susceptible to guessing. • Predominantly lower-level skills are appropriately assessed with these items.

  19. Types of Objective Test Items • Alternate-Choice Items (continued) • Variations • Yes-no item • Correction true-false item—requires students to rewrite any statement that is false. • Embedded alternate-choice item—presents a series of alternate-choice items within a paragraph. • Multiple true-false item—hybrid between multiple-choice and alternate-choice items (multiple true-false statements, each using the same stem).

  20. Item Analysis • Item Analysis: Analysis of the statistical characteristics of each item appearing on a test, for purposes of making decisions about retaining, revising, or discarding items. • Items should be evaluated: • While items are being drafted (using the table of specifications, guidelines, etc.). • Following test administration and scoring.

  21. Item Analysis • Item Analysis (continued) • Four basic statistics: • Item difficulty: Proportion of students who answered item correctly. • Item discrimination: Difference between proportion of correct answers in high-scoring and low-scoring groups. • Distractor analysis: Examines patterns of response for incorrect options. • Reliability: Overall consistency across all items.

  22. Item Analysis • Item Analysis (continued) • Item difficulty • Symbolized by p. • Simply divides the number of students who correctly answered an item by the number who attempted the item. • Can range from .00 (difficult) to 1.00 (easy). • Consider revising any item where p < .20 or p > .85.
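To make the computation concrete, here is a minimal Python sketch of item difficulty. The 0/1 response matrix (rows are students, columns are items) is hypothetical; the flagging thresholds are the ones suggested above.

```python
# Minimal sketch: item difficulty (p) = number correct / number attempting.
# Rows are students, columns are items (1 = correct, 0 = incorrect);
# the data are hypothetical.

scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]

n_students = len(scores)
for item in range(len(scores[0])):
    p = sum(row[item] for row in scores) / n_students
    flag = "  <- consider revising" if p < .20 or p > .85 else ""
    print(f"Item {item + 1}: p = {p:.2f}{flag}")
```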

  23. Item Analysis • Item Analysis (continued) • Item difficulty (continued) • Good judgment should also be used in conjunction with statistical analyses. • Teachers could reasonably expect all students to correctly answer some items; in those cases, p = 1.00 may simply indicate that all students have mastered the concept, not that the item is flawed.

  24. Item Analysis • Item Analysis (continued) • Item discrimination • Symbolized by D. • Purpose is to see how well each item discriminates between low- and high-scoring students (based on total test scores). • If an item functions well, most students in the high-scoring group will answer it correctly and most students in the low-scoring group will answer it incorrectly. • Items may discriminate positively or negatively. • D typically ranges from +.10 to +.60; items with negative values should be revised or discarded.
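A minimal Python sketch of item discrimination follows, reusing the hypothetical response matrix from the item-difficulty example. Splitting the ranked students into top and bottom thirds is one common convention (top and bottom 27% is another); the text above does not prescribe a particular split.

```python
# Minimal sketch: item discrimination (D) = p(high group) - p(low group).
# Students are ranked by total score; the top and bottom thirds form the
# high- and low-scoring groups. The split convention and the data are
# assumptions for illustration.

def discrimination(scores, item):
    ranked = sorted(scores, key=sum, reverse=True)  # rank by total score
    k = max(1, len(ranked) // 3)                    # size of each group
    high, low = ranked[:k], ranked[-k:]
    p_high = sum(row[item] for row in high) / k
    p_low = sum(row[item] for row in low) / k
    return p_high - p_low

scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
]
for item in range(len(scores[0])):
    print(f"Item {item + 1}: D = {discrimination(scores, item):+.2f}")
```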

  25. Item Analysis • Item Analysis (continued) • Distractor analysis • Informally examines patterns of responses across all options. • Reliability • Calculation of KR-21 reliability coefficient. • Ranges from .00 to 1.00; desirable range for classroom tests is from .70 to 1.00. • Sample item analysis…
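A minimal Python sketch of both statistics, under the same 0/1 response-matrix assumption as above. The option tallies are hypothetical, and the KR-21 formula used is the standard one based on the number of items k, the mean M, and the variance s² of total scores: KR-21 = (k / (k − 1)) · (1 − M(k − M) / (k · s²)).

```python
from collections import Counter

# Distractor analysis: an informal tally of which option each student chose
# on a single item. The responses below are hypothetical.
choices = ["B", "B", "C", "B", "A", "D", "B", "C"]
print(Counter(choices))  # -> Counter({'B': 4, 'C': 2, 'A': 1, 'D': 1})
# A distractor that nobody picks contributes no information and may need revision.

def kr21(score_matrix):
    """KR-21 reliability from a 0/1 response matrix.

    Uses the population variance of total scores; some texts use the
    sample variance instead, which changes the result slightly.
    """
    totals = [sum(row) for row in score_matrix]
    k = len(score_matrix[0])                 # number of items
    n = len(totals)
    mean = sum(totals) / n                   # M
    var = sum((t - mean) ** 2 for t in totals) / n   # s^2
    return (k / (k - 1)) * (1 - mean * (k - mean) / (k * var))

# Usage: kr21(scores); values of .70 or higher are desirable for classroom tests.
```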

  26. Validity and Reliability of Objective Test Items • Validity • Must be able to answer the following: • Am I measuring what I intend to measure? • To what degree do I have confidence in the decisions I will make based on those measures? • Of primary interest is content evidence of validity. • Reliability • Established through the use of statistical analyses, specifically, KR-21 reliability coefficient.
