Evaluation of Student Learning: Test Construction & Other Practical Strategies
Faculty Professional Development, Fall 2005

Dr. Kristi Roberson-Scott

Guiding Principles for Evaluation
  • Evaluation should relate directly to instructional objectives
  • Each evaluation activity should be designed to promote student growth
    • The actual activity should be useful practice in itself
    • Feedback should be useable by the student
  • Multiple evaluation strategies should be provided to assess mastery of each objective/competency
  • Students should clearly understand the methods of evaluation for each test or activity
Questions to Ask Yourself in Designing a Test
  • What objectives will (should) I be testing?
  • What types of items will be included in the test?
  • How long will the test be in terms of time and number of items?
  • How much will each objective be worth in terms of weighting and number of items?
Tests as Diagnostic Tools
  • Students demonstrate learning
  • Instructor effectiveness – modify teaching strategies or activities
  • Assignment of letter grades
Different Types of Tests & Learning
  • Paper & Pencil/WebCT Testing
    • Limited Choice Questions (MC, T/F, Matching)
    • Open-Ended Questions (Short Answer, Essay)
  • Performance Testing
    • Acquisition of skills that can be demonstrated through action (e.g., music, nursing, etc.)
Planning a Test
  • First step: Outline learning objectives or major concepts to be covered by the test
    • Test should be representative of objectives and material covered
    • Major student complaint: the test did not fairly cover the material it was supposed to cover.
Planning a Test
  • Second Step: Create a test blueprint
  • Third Step: Create questions based on blueprint
    • Match the question type with the appropriate level of learning
  • Fourth Step: For each check on the blueprint, jot down (perhaps on 3x5 cards) 3-4 alternative question ideas and item types that will get at the same objective
  • Fifth Step: Organize questions and/or ideas by item types
Planning a Test
  • Sixth Step: Eliminate similar questions
  • Seventh Step: Walk away from this for a couple of days
  • Eighth Step: Reread all of the items – try doing this from the standpoint of a student
Planning a Test
  • Ninth Step: Organize questions logically
  • Tenth Step: Time yourself actually taking the test and then multiply that by about 4, depending on the level of students (e.g., if the test takes you 12 minutes, allow roughly 48 minutes)
  • Eleventh Step: Analyze the results (item analyses)
Translating Course Objectives/Competencies into Test Items
  • Syllabus
    • Specification table: what was taught and the weight of each area to be tested
  • Creating a Test Blueprint (see handout)
    • Blueprint: the test plan, i.e., which questions test which concepts
    • Plot the objectives/competencies against a hierarchy representing levels of cognitive difficulty or depth of processing (a minimal sketch follows this list)
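To make the blueprint idea concrete, here is a minimal sketch in Python; the objectives, Bloom's levels, and item counts are hypothetical examples rather than anything from the original slides, and the layout is just one way to plot objectives against cognitive levels and derive weights.

    # Minimal test-blueprint sketch: objectives plotted against Bloom's levels.
    # The objectives and item counts below are hypothetical examples.
    blueprint = {
        "Objective 1: define key terminology":    {"Knowledge": 4, "Comprehension": 2},
        "Objective 2: apply core procedures":     {"Application": 5, "Analysis": 2},
        "Objective 3: evaluate sample arguments": {"Analysis": 2, "Synthesis": 1, "Evaluation": 2},
    }

    total_items = sum(sum(levels.values()) for levels in blueprint.values())
    print(f"Total items: {total_items}")

    for objective, levels in blueprint.items():
        n_items = sum(levels.values())
        weight = 100 * n_items / total_items
        breakdown = ", ".join(f"{level}: {count}" for level, count in levels.items())
        print(f"{objective} -> {n_items} items ({weight:.0f}% of the test) [{breakdown}]")

Reading across one objective shows how deeply it is tested; reading down one cognitive level shows how much of the test sits at that level, which is exactly the check the blueprint is meant to make easy.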
Thinking Skills
  • What level of learning corresponds to the course content?
  • Bloom’s Taxonomy of Educational Objectives
    • Knowledge (see handout)
    • Comprehension
    • Application
    • Analysis
    • Synthesis
    • Evaluation
Practical Considerations
  • Representative sample of the course content – not random, but purposeful, based on the blueprint
  • Representative sample of skill or cognitive levels across content
  • Analyze results by level AND content area
Question Arrangement on a Test
  • Group by question type
    • Common instructions will save reading time
  • Limit the number of times students have to change frame of reference
  • Patterns on test must be logical
    • Arrange from a content standpoint
    • Keep similar concepts together
  • Group by difficulty (easy to hard)
Selecting the Right Type of Evaluation
  • How do you know what type of question to use and when?
  • It depends on the skill you are testing.
  • Evaluation should always match as closely as possible the actual activity you’re teaching.
    • Examples: if teaching speech, evaluate an oral speech
    • If testing the ability to write in Spanish, give an essay
    • If testing reading: multiple choice or true/false
    • Don’t use multiple choice to test creative writing
Constructing the Test
  • Types of Test Questions:
    • Multiple-Choice Items
    • True-False Items
    • Matching Items
    • Fill-In, Completion or Short-Answer Items
    • Essay Questions
Multiple Choice Items
  • Advantages:
    • Extremely versatile – can measure higher-level mental processes (application, analysis, synthesis, and evaluation)
    • A compromise between a short answer/essay and a T/F item
    • A wide range of content can be sampled by one test
  • Disadvantages
    • Difficult to construct plausible alternative responses
Types of Multiple Choice Items
  • Four Basic Types
    • Question Type
    • Incomplete Statement Type
    • Right Answer Type
    • Best Answer Type
  • Which Type is Best?
    • Question Type vs. Incomplete Statement
    • Right Answer vs. Best Answer Type
Multiple Choice Items
  • Writing the stem first:
    • Be sure the stem asks a clear question
    • Stems phrased as questions are usually easier to write
    • Stems should not contain a lot of irrelevant info.
    • Appropriate reading level/terms
    • Be sure the stem is grammatically correct
    • Avoid negatively stated stems
Multiple Choice Items
  • Writing the correct response
    • Use same terms/reading level
    • Avoid too many qualifiers
    • Assign a random position in the answer sequence
  • Read the stem and correct response together
  • Generate the distractors/alternative responses
Multiple Choice Items
  • Other Tips for Constructing MC Items:
    • Items should have 3-4 alternatives.
    • Stem should present a single, clearly formulated problem
    • Simple, understandable, exclude extraneous words from both stem and alternatives
    • Include in the stem any words that are repeated in each response
    • Avoid “all of the above” (students can answer based on partial information)
    • Avoid “none of the above”
Multiple Choice Items
  • Alternative responses/distractors should be plausible and as homogeneous as possible
  • Response alternatives should not overlap
    • Two synonymous terms (arithmetic average/mean)
  • Avoid double negatives
    • None of the following are part of the brain except which one?
  • When negative wording must be used, emphasize it
  • Each item should be independent of other items in the test
    • Information in the stem of one item should NOT help answer another item.
True-False Test Items
  • Best suited for testing three kinds of information:
      • Knowledge level learning
      • Understanding of misconceptions
      • When there are two logical responses
  • Advantages:
    • Sample a large amount of learning per unit of student testing time
  • Disadvantages:
    • Tends to be very easy
    • 50-50 chance of guessing
    • Tends to be low in reliability
Tips for Constructing True/False Items
  • Avoid double negatives
  • Avoid long or complex sentences
  • Specific determiners (always, never, only, etc.) should be used with caution
  • Include only one central idea in each statement
  • Avoid emphasizing the trivial
  • Exact quantitative language (two, three, four) is better than qualitative language (some, few, many)
  • Avoid a pattern of answers
Objective Test Item Analyses
  • Evaluating the Effectiveness of Items:
    • Why?
      • Scientific way to improve the quality of tests and test items
      • Identify poorly written items which mislead students
      • Identify areas (competencies) of difficulty
    • Item analyses provide information on (a minimal sketch follows this list):
      • Item difficulty
      • Item discrimination
      • Effectiveness of alternatives in MC Tests
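As a rough illustration of how item difficulty and discrimination are often computed, here is a minimal Python sketch; the score matrix is hypothetical, and the upper-versus-lower-group split is just one common convention rather than anything prescribed by these slides.

    # Item-analysis sketch: difficulty (proportion answering correctly) and a
    # simple upper-minus-lower discrimination index per item.
    # Hypothetical data: rows = students, columns = items, 1 = correct, 0 = wrong.
    scores = [
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
        [1, 1, 0, 0],
        [0, 0, 1, 1],
    ]

    totals = [sum(row) for row in scores]
    ranked = [row for _, row in sorted(zip(totals, scores), key=lambda t: t[0], reverse=True)]
    k = max(1, len(scores) // 3)          # roughly the top and bottom thirds
    upper, lower = ranked[:k], ranked[-k:]

    for item in range(len(scores[0])):
        difficulty = sum(row[item] for row in scores) / len(scores)
        discrimination = (sum(r[item] for r in upper) - sum(r[item] for r in lower)) / k
        print(f"Item {item + 1}: difficulty = {difficulty:.2f}, discrimination = {discrimination:+.2f}")

Under this convention, items nearly everyone gets right or wrong (difficulty near 1 or 0) and items with near-zero or negative discrimination are the ones worth reviewing; tallying how often each distractor is chosen extends the same idea to the alternatives on a multiple-choice test.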
Short-Answer Items
  • Two Types: (Question and Incomplete Statement)
  • Advantages:
    • Easy to construct
    • Excellent format for measuring who, what, when, and where info.
    • Guessing is minimized
    • Students must know the material rather than simply recognize the answer
  • Disadvantages:
    • Grading can be time consuming
    • More than one answer can be correct
Short Answer Items
  • Tips for Constructing Short Answer Items
    • Better to supply the term and require a definition
    • For numerical answers, indicate the degree of precision expected and the units in which they are to be expressed.
    • Use direct questions rather than incomplete statements
    • Try to phrase items so that there is only one possible correct response
    • When incomplete statements are used, do not use more than one blank within an item.
Essay Questions

Types of Essay Questions

  • Extended Response Question
    • Great deal of latitude on how to respond to a question.
    • Example: Discuss essay and multiple-choice type tests.
  • Restricted Response Question
    • More specific, easier to score, improved reliability and validity
    • Example: Compare and contrast the relative advantages and disadvantages of essay and multiple-choice tests with respect to: reliability, validity, objectivity, & usability.
Essay Items
  • Advantages:
    • Measures higher learning levels (synthesis, evaluation) and is easier to construct than an objective test item
    • Students are less likely to answer an essay question by guessing
    • Require superior study methods
    • Offer students an opportunity to demonstrate their abilities to:
      • Organize knowledge
      • Express opinions
      • Foster creativity
Essay Items
  • Disadvantages
    • May limit the sampling of material covered
      • Tends to reduce the validity of the test
    • Subjective, unreliable nature of scoring
      • “Halo effect” – the student’s previous level of performance (good or bad)
      • Written expression
      • Handwriting legibility
      • Grammatical and spelling errors
    • Time consuming
Essay Questions
  • Give students a clear idea of the scope & direction intended for the answer
    • Might help to start the question with the description of the required behavior (e.g., compare, analyze)
  • Appropriate language level for students
  • Construct questions that require students to demonstrate a command of background info, but do not simply repeat that info.
  • If a question calls for an opinion, be sure that the emphasis is not on the opinion itself but on the way it is presented or argued.
  • Use a larger number of shorter, more specific questions rather than one or two longer questions so that more information can be assessed.
Essay Questions
  • You might
    • Give students a pair of sample answers to a question of the type you will give on the test.
    • Sketch out a rubric (grading scheme) for each question before reading the papers OR randomly select a few to read and make up the grading scheme based on those answers
    • Give students a writing rubric
    • Detach identifying information and use code numbers instead to avoid letting personality factors influence you.
    • After grading all the papers on one item, reread the first few to make sure you maintained consistent standards
    • Make clear to students the extent to which factors other than content (e.g., grammar, handwriting) will influence the grade.
Essay Questions
  • Tips for constructing Essay Questions
    • Provide reasonable time limits for each question
      • “thinking and writing time”
    • Avoid permitting students a choice of questions
      • You will not necessarily get a representative sample of student achievement; only by requiring all students to answer all questions can their achievement be compared
    • A definite task should be put forth to the student
      • Critical words: compare, contrast, analyze, evaluate, etc.
Scoring Essay Items
  • Write an outline of the key points and use it to design a rubric (a minimal scoring sketch follows this list)
  • Determine how many points are to be assigned to the question as a whole and to the various parts within it.
  • If possible, score the test without knowledge of the student’s name
    • Face Sheet
  • Score all of the answers to one question before proceeding to the next question
    • Consistent standard
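To show how a key-point outline can turn into a consistent scoring scheme, here is a minimal, hypothetical Python sketch; the rubric elements, point values, and the coded answer ID are invented for illustration and are not taken from the slides.

    # Hypothetical rubric sketch: key points for one essay question and the
    # points assigned to each, applied to an answer identified only by a code number.
    rubric = {
        "states a clear thesis": 2,
        "supports the thesis with course concepts": 4,
        "addresses a counterargument": 2,
        "organization and clarity": 2,
    }

    def score_answer(checks: dict) -> int:
        """Sum the points for each rubric element the answer satisfies."""
        return sum(points for element, points in rubric.items() if checks.get(element, False))

    # Example: one answer, graded blind against the rubric.
    answer_017 = {
        "states a clear thesis": True,
        "supports the thesis with course concepts": True,
        "addresses a counterargument": False,
        "organization and clarity": True,
    }
    print(f"Answer 017: {score_answer(answer_017)} / {sum(rubric.values())} points")

Scoring every answer to one question against the same rubric before moving to the next question is what keeps the standard consistent, as recommended above.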
Scoring Essay Exams
  • If possible, score each set of answers within the same time frame
  • Handwriting, spelling, & neatness
    • Two separate grades?
      • Mastery of material
      • Other
Alternative Methods of Assessment
  • Research/Term Papers
  • Research Reviews
  • Reports
  • Case Studies
  • Portfolios
  • Projects
  • Performances
  • Peer evaluation
  • Mastery
  • Simulations
Cheating
  • Preventing Cheating
    • Reduce the pressure (multiple evaluations)
    • Make reasonable demands (length/content of exam)
    • Use alternative seating
    • Use alternative forms
    • Be cautious with extra copies
Using Assessment & Evaluation to Improve Student Learning Outcomes
  • Providing feedback to students
  • Closing the assessment & evaluation loop
  • Maximizing student learning