Assessments and Development


Presentation Transcript


  1. Assessments and Development TECM 5180 Dr. Lam

  2. Assessment Overview We’ll cover the following tonight: • Determining the purpose of assessments and differentiating criterion-referenced and norm-referenced assessments • Differentiating four types of assessments • Defining three characteristics of good assessments • Differentiating performance and objective assessments • Matching learning styles with assessment item types (i.e., multiple choice, essay, etc.) • Writing individual assessment items

  3. Determine the purpose of an assessment It’s typically not a good idea to create an assessment because you think you “need” to. Instead, determine why you are assessing something in the first place. There are two common purposes for assessments: • To determine level of competence • To compare or rank learners’ abilities

  4. Criterion-Referenced Assessment Instruments • Fancy term for competency assessments • Very similar to a pass/fail assessment: “Is the learner competent based on some set of criteria?” • E.g., Performance assessment asking a learner to log onto their email. • Doesn’t allow us to rank or compare learners

  5. Norm-Referenced Tests • Fancy term for an assessment that allows ranking • While they allow ranking, they are not useful in pinpointing competencies and gaps in learning • E.g., GRE, SAT, ACT

  6. What do Instructional Designers usually choose? • Criterion-referenced tests (competency) • Match learning objectives to assessment items • Less interested in a normal distribution, more interested in high levels of competency

  7. Criterion-Referenced or Norm-Referenced? • Determine which students should be licensed as physical therapists • Determine which applicants to admit to graduate program with limited positions. • Determine whether a learner is ready to begin the next unit of study. • Determine which students should receive a scholarship when requests for scholarships exceed resources. • Determine how the reading abilities of one state’s students compare to those in another state. • Determine in which areas a learner needs remediation.

  8. Types of Assessments • Entry skills assessments- Designed to test mastery of prerequisite skills. Helps determine if learners are even ready to begin. • Preassessments- Used to determine if learners have already mastered elements in your instruction • Postassessments- Given at the end and used to assess both your learners and your instructional design • Practice tests- Given during instruction; allow for immediate feedback and real-time practice

  9. Characteristics of Good Assessment Instruments • Validity- How well an assessment measures what it purports to measure • E.g., an objective assessment made up of essay questions and multiple-choice items that is supposed to test whether a user can create HTML5 • Reliability- How well an assessment instrument yields similar results when repeated • E.g., personality tests that use multiple items to test the same trait • Practicality- How closely the assessment instrument matches the context in which learners must perform
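The reliability idea above can be sanity-checked numerically with an internal-consistency statistic such as Cronbach's alpha, which compares per-item variance to total-score variance. This is a minimal sketch, not from the slides; the score matrix is invented for illustration.

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a learners-by-items score matrix.

    scores: one row per learner; each row holds that learner's
    score on every item. Higher alpha = items agree more.
    """
    k = len(scores[0])                                   # number of items
    item_vars = [pvariance(col) for col in zip(*scores)]  # variance of each item
    total_var = pvariance([sum(row) for row in scores])   # variance of totals
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical data: four learners, three related items (1 = correct)
scores = [[1, 1, 1],
          [1, 0, 1],
          [0, 0, 0],
          [1, 1, 0]]
print(round(cronbach_alpha(scores), 2))  # 0.63 — moderate consistency
```

A reliable instrument yields similar results on repetition; items that measure the same trait should co-vary, which is exactly what alpha captures.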

  10. Two Major types of Assessments • Performance Assessments- Learners actually perform some skill or task to demonstrate competency or mastery. • Must have a checklist of criteria created to assess competency • Paper and pencil tests- Also referred to as “objective-style” tests. • Must properly match assessment item type to the learning type specified in your objective

  11. Recall Items • Short answer, fill-in-the-blank, or completion • Best for declarative knowledge objectives • Ask learners to reproduce knowledge

  12. Recognition Items • Multiple choice, matching, true/false • Require learner to recognize or identify the correct answer from a group of alternatives • Assess declarative knowledge that has been memorized. • Can also be used with intellectual skills if constructed well

  13. Constructed Answer Items • Short answer or essay • Require that learners actually produce or construct a response • Better for intellectual skills

  14. Writing Test Items • Read the objective and determine what it wants someone to be able to do (i.e., identify the performance). • Draft a test item that asks students to exhibit that performance. • Read the objective again and note the conditions under which the performing should occur (i.e., tools and equipment provided, people present, key environmental conditions). • Write those conditions into your item. • For conditions you cannot provide, describe approximations that are as close to the objective as you can imagine. • If you feel you must have more than one item to test an objective, it should be because (a) the range of possible conditions is so great that one performance won’t tell you that the student can perform under the entire range of conditions, or (b) the performance could be correct by chance. Be sure that each item calls for the performance stated in the objective, under the conditions called for.

  15. How many test items? • Consider how many items are necessary to gauge mastery or competency • E.g., If our objective is for learners to be able to use the &lt;ol&gt; tag and &lt;ul&gt; tag correctly, would we want a few assessment items, or many? • E.g., If our objective is for learners to be able to differentiate JavaScript, HTML, and CSS from examples and nonexamples, would we want a few assessment items, or many? • Consider the possibility of guessing the correct answer • Consider the possibility of knowing the correct answer, but choosing the wrong answer
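The guessing concern above can be quantified with the binomial distribution: the chance of reaching a pass mark by blind guessing depends on the number of items and the number of options per item. A sketch with made-up numbers (the 5-item test and 4/5 pass mark are assumptions for illustration):

```python
from math import comb

def p_pass_by_guessing(n_items, n_options, min_correct):
    """Probability of getting at least min_correct of n_items right
    by guessing uniformly among n_options choices per item."""
    p = 1 / n_options
    return sum(comb(n_items, j) * p**j * (1 - p)**(n_items - j)
               for j in range(min_correct, n_items + 1))

# 5 four-option multiple-choice items, pass mark 4/5:
print(p_pass_by_guessing(5, 4, 4))   # 0.015625 — guessing rarely passes
# Same test as true/false items: the guess factor jumps
print(p_pass_by_guessing(5, 2, 4))   # 0.1875
```

This is why more items (or more options per item) make a competency decision more trustworthy when guessing is possible.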

  16. Multiple Choice Questions Limitations • Difficult and time-consuming to construct • Leads exam writer to favor simple recall of facts (no real depth) Advantages • Can measure all levels of cognitive ability (though best suited for declarative and intellectual skills) • Effective to administer and score • Provides an objective measure of achievement • Allows a wider sample of subject matter (covers a great deal in a short time)

  17. Two types of multiple choice questions • Traditional stem and options • Scenario based- Provide a scenario and present a set of multiple choice questions based on the scenario • Good for higher order thinking (intellectual skills, cognitive skills, etc.) • Stanford rationale approach- provide rationale in addition to the correct answer

  18. Tips for Constructing Multiple Choice Stem • Write as direct question rather than incomplete statement. • Pose definite, explicit, and singular problem • Do not include unnecessary verbiage or irrelevant information • Include any words that might otherwise be repeated in each alternate • Emphasize negatives (if you use them at all)

  19. Tips for constructing multiple choice options • Make stem and alternatives grammatically consistent • Present only one correct or best response • Make alternatives approximately equal in length • Avoid clues that give away the answer • Grammatical clues (“a” vs. “an”) • Verbal association • Use at least four alternatives • Randomly distribute correct responses • Avoid “all of the above” and “none of the above”
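The "randomly distribute correct responses" tip is easy to automate when assembling a test. A minimal sketch, assuming a simple stem/options/key item structure invented for this example:

```python
import random

def shuffle_options(stem, options, correct):
    """Shuffle the answer options and report where the key landed."""
    shuffled = options[:]        # copy so the caller's list is untouched
    random.shuffle(shuffled)
    return stem, shuffled, shuffled.index(correct)

random.seed(1)  # fixed seed only so the example is reproducible
stem, opts, key = shuffle_options(
    "HTML is primarily used for:",
    ["content markup", "styling", "interactivity", "databases"],
    "content markup")
print(opts[key])  # content markup — correct answer, wherever it landed
```

Shuffling per item prevents the pattern bias (e.g., "C is usually right") that hand-built answer keys tend to develop.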

  20. Good or bad, and why? Question 1 1A. The promiscuous use of sprays, oils, and antiseptics in the nose during acute colds is a pernicious practice because it may have a deleterious effect on: • the sinuses • red blood cells • white blood cells • the olfactory nerve 1B. Frequent use of sprays, oils, and antiseptics in the nose during a bad cold may result in: • the spreading of the infection to the sinuses • damage to the olfactory nerve • destruction of white blood cells • congestion of the mucous membrane in the nose

  21. Good or bad, and why? Question 2 2A. In 1965, the death rate from accidents of all types per 100,000 population in the 15-24 age group was: • 59.0 • 59.1 • 59.2 • 59.3 2B. In 1965, the leading cause of death per 100,000 population in the 15-24 age group was from: • respiratory disease • cancer • accidents • rheumatic heart disease

  22. Good or bad, and why? Question 3 • 3A. About how many calories are recommended daily for a 14-year old who is 62 in. tall, weighs 103 lbs., and is moderately active? • 1,500 • 2,000 • 2,500 • 3,000 • 3B. About how many calories are recommended daily for a 14-year old who is 62 in. tall, weighs 103 lbs., and is moderately active? • 0 • 2,000 • 2,500 • 3,000

  23. Good or bad, and why? Question 4 • 4A. Which of the following is a category in the taxonomy of the cognitive domain? • Reasoning ability • Critical thinking • Rote learning • All of the above • None of the above • 4B. What is the most complex level in the taxonomy of the cognitive domain? • Knowledge • Synthesis • Evaluation • Analysis • Comprehension

  24. Good or bad, and why? Question 5 • 5A. The mean of a distribution of test scores is the: • Most frequently occurring score • Arithmetic average • 50th percentile • Measure of score range • 5B. A school developed an aptitude test to use for admission to its Honors Program. The test was administered to a group of seven applicants who obtained the following scores: 70,72,72,80,89,94,98. The mean score on the aptitude test is: • 72 • 82 • 80 • 90

  25. True/False Questions Limitations • Extremely high guess factor • Leads exam writer to favor testing of trivial knowledge • Exam writer often writes ambiguous statements when testing higher levels of cognitive skill, due to difficulty in writing unequivocally true/false statements Advantages • Allows the widest sampling of subject matter per unit of time • Effectively administered • Objective measurement of achievement

  26. Hints for Constructing True/False Items • Base items on statements that are absolutely true or false, without qualifications or exceptions • Write statements as simply and clearly as possible • Express a single idea in each item • Avoid lifting statements from text, lecture, or other materials so that memory alone will not permit a correct answer • Avoid use of negatively stated items • Avoid use of unfamiliar vocabulary • Avoid qualifying words (e.g., usually, sometimes, often) that cue the answer

  27. Good or Bad and why? • All spiders have exoskeletons and only prey on insects. • A subject pronoun is used to replace another noun. • Bread and grains are not at the top of the food pyramid. • HTML is often used for content markup. • CSS allows you to control the design and layout of web pages.

  28. Matching Items Limitations • Difficult to measure learning objectives with higher order thinking (anything other than recall of information) • Difficult to select common set of conditions and responses • If options cannot be used more than once, the questions are not mutually exclusive (one incorrect answer means a second is automatically incorrect) Advantages • Simple to construct • Short reading and response time, allowing more content to be included in a given set of matching questions. • Effectively administered • Objective measurement of achievement • Well-suited to measure associations between facts

  29. Tips for constructing Matching Items • Include clear directions and explain the basis for matching items • State whether a response can be used more than once • Avoid grammatical cues (e.g., the, a, an) • Use longer phrases as questions and shorter phrases as options • Number each question and use alphabetical letters for the options • Make all the questions and all options the same type (e.g., a list of events to be matched with a list of dates)

  30. Good or bad, and why? Question: From Column II, select the name or date that is associated with the statement in Column I. Record your choice on the line preceding the question number. Each answer may be used only one time.
Column I
___ 1. The year in which the Declaration of Independence was signed.
___ 2. The first President of the United States.
___ 3. The year in which the Civil War began.
___ 4. The baseball player who holds the home run record.
___ 5. The inventor of bifocals.
Column II
a. George Washington
b. Benjamin Franklin
c. Barry Bonds
d. 1777
e. 1861

  31. Good or bad, and why? Question: Several inventions of historical significance are listed in Column I. For each question, select the name in Column II which is associated with that invention. Record your choice on the line preceding the question number. Remember that an answer may be used only one time.
Column I
____ 1. airplane
____ 2. steamboat
____ 3. automobile
____ 4. radio
____ 5. iron stoves
____ 6. television
Column II
a. John Baird
b. Sir Frederick Banting
c. Henry Ford
d. Benjamin Franklin
e. Robert Fulton
f. Marchese Marconi
g. Orville Wright

  32. Short Answer/Completion Limitations • Difficult to construct so that the desired response is indicated • Typically limited to measurement of simple recall • Possibility of containing more irrelevant clues as compared to other test items • Takes longer to score • Improper preparation can lead to more than one correct answer Advantages • Minimizes guessing • Efficiently measures lower levels of cognitive ability • Series of well-constructed items can measure higher level thinking

  33. Tips for Completion Items • Omit only insignificant words • Avoid grammatical clues • Make sure only one brief, correct response is possible • Make blanks equal in length • Multiple answer blanks should be avoided • The main idea should precede the blank (blank at the end of the statement) • Blanks should require key words • Answer called for should be clear to the trainee

  34. Good or bad, and why? • What is a red corpuscle? _________________ • Which of the cells found in the human body carries oxygen to all other living cells?______________ • What unit of measurement is exactly 1/16th of a pound?_________ • An ____________ weighs less than a pound.

  35. Essay Items Limitations • Time consuming to score • Possibility for subjectivity in scoring • Difficult to measure a large amount of content • Can encourage bluffing • Generally has low test and scorer reliability Advantages • Easier and less time-consuming to construct as compared to other types • Provides means for testing trainees’ ability to compose answer and present in logical manner • Reduces guesses • Can efficiently measure higher order thinking

  36. Tips for Writing Essay Questions • Use questions that can be answered in a short time and space • Question should be specific, phrased so the trainee will be able to answer • For questions requiring a lengthy response, indicate in outline form the information desired • Prepare in advance a scoring key that shows acceptable responses and relative weights • Use analytic scoring (point system) or holistic scoring rubrics • Provide general scoring criteria
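Analytic (point-system) scoring amounts to a weighted checklist: each acceptable response element carries a point value, and the essay's score is the sum of the elements present. A minimal sketch; the criteria and weights below are invented, loosely echoing the engine-essay example on the next slide.

```python
def analytic_score(response_hits, scoring_key):
    """Sum the points for each criterion the grader marked present.

    response_hits: set of criterion names found in the essay
    scoring_key:   dict mapping criterion -> point weight
    """
    return sum(points for criterion, points in scoring_key.items()
               if criterion in response_hits)

# Hypothetical scoring key prepared in advance (10 points total)
key = {"names fuel/air mixture": 3,
       "explains carburetor role": 4,
       "links piston to power stroke": 3}

print(analytic_score({"names fuel/air mixture",
                      "links piston to power stroke"}, key))  # 6
```

Preparing the key before reading any essays is what keeps the scoring objective and repeatable across graders.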

  37. Good or bad, and why? • Explain the internal combustion engine? How does it run? • Explain the interrelationship of the fuel, the mixture of fuel and air in the carburetor, and the piston in the operation of an internal combustion engine.

  38. Development Overview • What you already know • What you need to know

  39. What you already know… As a technical communicator, you should already know these important aspects of development: • Importance of time management when producing deliverables (writing process) • Technology (styles in Word, InDesign, etc.) • Document design principles • Usability • Research

  40. What you may not know yet… • Where or how to start • Participant’s guide • Other?

  41. Where and how to start your development • Course requirements • Design document (obviously) • Dividing development tasks • Researching and understanding the content

  42. Obvious areas of content • Written content in participant’s guide • Summarized content in presentation slides • Job aids and other handouts

  43. Not so obvious areas for content • Facilitator oral content – general content must be written in facilitator’s guide • Discussion among participants – “script” must be written in the facilitator’s script • Instructions for facilitating a particular discussion • Discussion questions • Time • Follow-up questions

  44. What’s next? • You should begin development this week • Next week, we’ll cover implementation and evaluation (I and E in ADDIE) • We won’t, however, spend too much time on this as you won’t actually be performing these steps for this first project • As a team, you should already be thinking about your video tutorials. Begin brainstorming topics.

  45. Resources • Planning test questions • http://www.utexas.edu/academic/ctl/assessment/iar/students/plan/method/exams.php
