Sound Assessment Design

Common Formative Assessment

Sound Assessment Design

Draft: 3/30/2013

Welcome and Introductions

Please take a moment to introduce (or reintroduce) yourself to the group by sharing your name, district, and position.

Our trainers for the day are…

Common Formative Assessment (CFA)

Overview and Purpose of CFA

Quality Assessment Design

Developing Meaningful Learning Targets

Quality Assessment Design

Performance Events

Selected Response Items

Constructed Response Items

Learner Objectives of the Formative Assessment Series

  • Understand the clear purposes of assessment by clarifying:

    • Why they are assessing

    • Who will use the results of assessment data

    • What they will do with the assessment data

  • Develop clear and meaningful learning targets to guide instruction and student learning.

  • Construct quality assessment instruments which are of sound design and measure pre-determined learning targets.

Outcomes for the Day

As a result of today's training, you will…

  • develop a better understanding of the components and characteristics of a quality formative assessment.

  • develop a better understanding of various types of assessment items and the pros and cons of each type.

  • continue using a backward design approach and a template form to write a formative assessment.

  • evaluate your formative assessment for quality.

Module 3: Quality Assessment Design

Essential Questions:

  • What decisions drive the type of assessment items to use in common formative assessments?

  • What are the essential components needed to create a quality formative assessment?

  • What are the characteristics of quality selected-response, constructed-response and performance tasks?

Session at a Glance

  • Introductions/Objectives/Outcomes/Norms

  • Brief Review of Assessment Principles and Reflection

  • Selected Response Items

  • Constructed Response Items

  • Performance Tasks

  • Continue with Assessment Development Process

  • Evaluate Your Test

  • Using Data to Inform Test Writing Skills

  • Implementation Steps, Roadblocks, and Supports

  • Additional Learning

  • Closure



Norms

  • Begin and end on time

  • Be an engaged participant

  • Be an active listener – open to new ideas

  • Use notes for side bar conversations

  • Use electronics respectfully

Accurate Assessment



[Diagram: Keys to Quality Assessment]

  • What are the learning targets? Are they clear? Are they good? Be sure students understand targets too!

  • What's the purpose? Who will use the results? Students are users too!

  • What method? Written well? Sampled how? Avoid bias how?

  • How is information effectively used? Students track progress and communicate! Students can participate in the process too!

Source: Adapted from Classroom Assessment for Student Learning: Doing It Right - Using It Well, by R. J. Stiggins, J. Arter, J. Chappuis, & S. Chappuis, 2004, Portland, OR.

Accurate Assessment


[Diagram: Sound Design]

  • What method?

  • Written well?

  • Sampled how?

  • Avoid bias how?

Source: Adapted from Classroom Assessment for Student Learning: Doing It Right - Using It Well, by R. J. Stiggins, J. Arter, J. Chappuis, & S. Chappuis, 2004, Portland, OR.

Read and Reflect with a shoulder partner…

Every educator must understand the principles of sound assessment and must be able to apply those principles as a matter of routine in doing their work. Accurate assessment is not possible unless and until educators are given the opportunity to become assessment literate. (They) must understand student achievement expectations and how to transform those expectations into accurate assessment exercises and scoring procedures. (NEA, 2003)

Common Formative Assessments, Larry Ainsworth & Donald Viegut, 2006, Corwin Press, p. 53

What is Assessment Literacy?

“The ability to understand the different purposes and types of assessment in order to select the most appropriate type of assessment to meet a specific purpose.” (Larry Ainsworth, 2006)

What evidence do we need that students have met our stated purpose(s)?

“Fruitful assessment often poses the question ‘what is an array of ways I can offer students to demonstrate their understanding and skills?’ In this way, assessment becomes a part of teaching for success and a way to extend rather than merely measure learning.”

Quote by Carol Ann Tomlinson, 1995, taken from Common Formative Assessment by Ainsworth and Viegut

Video Clip

Assessment of Missouri Learning Standards

  • The knowledge, skills, and processes specified in Missouri's Learning Standards (MLS) for Mathematics and English Language Arts will be measured by the Smarter Balanced Assessment Consortium (SBAC) using a variety of test item types: selected response, constructed response, and performance tasks.

  • Sample SBAC items may be viewed on the website:

Let’s define…

  • Selected-response assessments

  • Constructed-response assessments

  • Performance Assessments

Pull out this template and pair up with someone. Using your current level of understanding, create a definition and identify the benefits and drawbacks of each type of assessment.

Selected-Response Assessments…

  • Require students to select one response from a provided list

  • Types include multiple-choice, true-false, and matching

  • Also include short answer/fill-in-the-blank (when a list of answer choices is provided)

  • Assess the student’s knowledge of factual information, main concepts, and basic skills

Benefits of Selected-Response Items

  • Can be scored quickly

  • Can be scored objectively as correct or incorrect

  • Can cover a wide range of content

Drawbacks to Selected-Response Items

  • Tend to promote memorization of factual information rather than higher-level understanding

  • Inappropriate for some purposes (performance, writing, and creative thinking)

  • Lack of student writing in most cases, unless part of assessment design (Haladyna, 1997, pp. 65-66)

Five Roadblocks to Effective Item Writing

  • Unclear directions

  • Ambiguous statements

  • Unintentional clues

  • Complex phrasing

  • Difficult vocabulary

    (Popham, 2003b, p. 64)

Key Points for Writing SR Items

  • Choose a selected-response format that aligns to the standard being measured

  • Make sure the items will produce the needed evidence to determine mastery of the standard

  • Include the vocabulary of the standard selected for assessment (as appropriate)

  • Make sure the test question(s) require the same level of rigor as that of the standard

  • Write each stem first, then write distractors

Example of a Selected Response Item

Many experts will tell you that television is bad for you. Yet this is an exaggeration. Many television programs today are specifically geared towards improving physical fitness, making people smarter, or teaching them important things about the world. The days of limited programming with little interaction are gone. Public television and other stations have shows about science, history, and technical topics.

Which sentence should be added to the paragraph to state the author’s main claim?

A. Watching television makes a person healthy.

B. Watching television can be a sign of intelligence.

C. Television can be a positive influence on people.

D. Television has more varied programs than ever before.

Example of another Selected Response Item

The Smarter Balanced Assessment Consortium (SBAC) has multiple-choice items that cue students to select more than one answer.

A third example of a Selected Response Item

Use the illustration and your knowledge of social studies to answer the following question.

What colonial claim about the Boston Massacre is supported by this illustration?

A. Most American colonists in Boston were killed.

B. British soldiers fired on unarmed colonists.

C. There were more soldiers than civilians at the Boston Massacre.

D. Colonists were better equipped for war than British soldiers were.

Constructed-Response Items…

  • Require students to organize and use knowledge and skills to answer a question or complete a task

  • Types include short-answer; open response; extended response; essays

  • More likely to reveal whether or not students understand and can apply what they are learning.

  • May utilize performance criteria (rubrics) to evaluate degree of student proficiency.

Benefits of Constructed-Response Items

  • Contribute to more valid inferences about student understanding than selected-response items do.

  • Measure higher levels of cognitive processing.

  • Allow for diversity in student responses or solution processes.

  • Provide a better picture of students’ reasoning processes.

  • Promote the use of evidence to support claims and ideas.

Drawbacks of Constructed-Response

  • Take longer to score

  • Can have errors in design

  • Dependent on student writing proficiency

  • A challenge to score consistently and objectively

  • Must have clear rubrics for scoring criteria so scoring is not subjective

Video Clip


Key Points to Writing Constructed Response Items

  • Items should be open-ended and require students to create a response

  • Students must demonstrate an integrated understanding of the “unwrapped” concepts and skills

  • Items must match the level of rigor of the “unwrapped” standards

  • A proficient answer reflects the understanding of higher-order instructional objectives

  • Constructed-response items MUST be accompanied by scoring guides.

Example of Constructed Response Item

The table shows the price of different quantities of medium-sized apples at Tom’s Corner Grocery Store. What is the least amount of money needed to buy exactly 20 medium-sized apples if the bags must be sold intact and there is no tax charged? Be sure to show all of your work.

Another example of Constructed Response Item

Scenario: Your friend is using the computer to type his one-page report for history. It is just two lines over one page and he doesn’t know how to make it fit on one page.

Question: Using the proper computer terminology, describe to your friend two valid solutions for making his report fit on one page without deleting any of the content.

A third example of Constructed Response Item

Stimulus: Information on Sally's Experimental Design

Evaluate Sally’s experimental design. Identify two things Sally could have done differently to make her results more valid. Give reasoning for each one of your suggestions.

Performance Tasks…

  • Require students to construct a response, create a product, or perform a demonstration.

  • Are open-ended and usually allow for diversity in responses or solution processes

  • Are evaluated using scoring criteria given to students in advance of performance


Benefits of Performance Tasks

  • Can assess multiple learning targets and application of knowledge

  • Are highly engaging for students

  • Promote critical thinking and/or problem solving

  • Promote peer and self-assessment

  • Offer multiple opportunities for students to revise work using scoring-guide feedback

Drawbacks of Performance Tasks

  • Rubrics are more involved and take longer to develop

  • Performances take longer to score

  • Can have errors in evaluative design

  • Success is often dependent on factors other than those targeted for assessment (e.g., writing ability, verbal skills, physical abilities)

  • A challenge to score objectively and consistently

Key Points to Writing a Performance Task

  • Student performance should be recorded on a checklist or scoring rubric.

  • Contains a written prompt that cues the student to perform some type of task that requires a demonstration, presentation or product creation.

  • Shows connections by measuring learning targets across strands or content areas.

  • Models what application of learning looks like in life beyond the classroom.

  • Should measure mastery of multiple learning targets and higher-level cognitive processes.

  • May be completed in one or more sittings or over time.

Example of Performance Task

Part I: During the U.S. Civil War, quilts became a popular item for women to make. You will write an informative essay summarizing the history and purposes of Civil War quilts.

To gain the information needed to write your essay, you will watch a video and read two articles about quilts that were made during the Civil War. Take notes because you may want to refer back to your notes while writing your essay.

Part II: Your class is planning a field trip to a history museum. To help prepare for what you will see, write an informative essay about Civil War quilts. In your essay, discuss the history of the quilts, including the reasons people made these quilts during the Civil War, and explain how the quilts were made. Include evidence from the sources in Part I to support the information in your essay. A rubric showing how your essay will be scored is provided.

Another example of Performance Task

A third example of Performance Task

Reflection Time

Think about the content, skills and processes you teach in your classroom and answer the three questions below.

Module #3 is a continuation of Module #2!

[Diagram: Module 2 leads into Module 3]

Now let’s practice!

Using either the sample CFA Development Template or personal work created in Steps 1 through 5 previously, complete Steps 6 through 9 by selecting the appropriate types of assessments and matching the test items to the learning targets.

Step 10: Define Achievement Levels

When items are written, complete Step 10 by describing how information from the scoring guides can be used collectively to determine achievement levels for students. These levels will be used later in the Data Team Process.

Step 11: Review and Revise

Review test items collaboratively to affirm the quality and appropriateness of the test items.

Evaluate Your Test

  • Collectively, do the assessment items produce the necessary evidence to determine whether or not the student has mastered the standard(s) targeted for assessment?

  • Are the assessment items the most appropriate type to use to measure the targeted standard?

  • Do the assessment items require students to demonstrate the same level of rigor as specified by the targeted standard?

  • Are the items worded clearly and concisely?

  • Are the directions clear, so students understand what they are to do?

  • Are the items free of any type of bias?

Using Data to Inform Test Writing Skills

Written by Jana Scott, MAP Instructional Facilitator, University of MO-Columbia, 2007.

  • Use of data from a careful item analysis can help a teacher improve his/her test writing skills.

  • Additionally, looking at student results can give the teacher ideas as to improvements that need to be made in instruction and/or curriculum.


Possible Causes for Faulty Items/Low Scores on Items

  • Basic content or skills have not been addressed or taught.

  • Students are unfamiliar with the process needed to solve the problem/answer the question (e.g., problem solving, deductive/inductive thinking, making an inference).

  • Students are unfamiliar with the format needed to answer the question (e.g., political cartoon, letter, graph).

  • Students are unfamiliar with the meaning of the language used in the test item (e.g., compare and contrast, paraphrase, illustrate, evaluate).

  • Students lack reading ability, or the vocabulary used in the item stem or stimulus is too difficult.

  • Wording of the item is unclear or confusing.

  • The rubric does not align with the test item. The rubric holds students accountable for something that was not cued in the item stem.

  • The rubric holds students to a standard that is not grade-level appropriate.

  • The item asks the impossible or improbable (e.g., asking for two similarities and two differences, or for three details, when there are not that many).

  • The stimulus material used as a basis for item development is at fault.



Based on what you have learned today,

  • What steps might you take in order to become a “top notch” writer of formative assessments and of the various types of test items?

  • What potential challenges do you foresee? How might these be overcome?

  • What tools and/or resources might you use to ensure the assessments and assessment items you write are top quality?

Practice Profile

Implementation Fidelity

For Additional Learning

  • In-depth additional training from RPDC staff members on how to write quality selected-response items, constructed-response items, and performance events.

  • Books contained in Bibliography on the next slide.

  • Websites and Videos about Formative Assessment







Bibliography

  • Common Formative Assessments: How to Connect Standards-Based Instruction and Assessment; Larry Ainsworth, Donald Viegut; Corwin Press; 2006.

  • Common Formative Assessment Training Manual; Second Edition; The Leadership and Learning Center; Houghton Mifflin Harcourt; 2011.
