
Smarter Balanced Assessment Consortium: Next Generation Assessment


Presentation Transcript


  1. Smarter Balanced Assessment Consortium: Next Generation Assessment

  2. What’s Next? • Overview of the Smarter Balanced Assessment Consortium • Significant Assessment Shifts • Types of Items • Cognitive Rigor Matrix • Deconstruction of a Performance Task

  3. A National Consortium of States • 28 states representing 44% of K-12 students • 21 governing, 7 advisory states • Wisconsin is a governing state

  4. SBAC Balanced Assessment System • Common Core State Standards specify K-12 expectations for college and career readiness • Summative assessments benchmarked to college and career readiness • Interim assessments: flexible, open, used for actionable feedback • Teacher resources for formative assessment practices to improve instruction • Teachers and schools have the information and tools they need to improve teaching and learning • All students leave high school college and career ready

  5. Using Computer Adaptive Technology for Summative and Interim Assessments • Faster results: turnaround in weeks compared to months • Shorter test length: fewer questions compared to fixed-form tests • Increased precision: provides accurate measurement of student growth over time • Tailored to student ability: item difficulty adjusts based on student responses • Greater security: larger item banks mean that not all students receive the same questions • Mature technology: GMAT, GRE, COMPASS (ACT), Measures of Academic Progress (MAP)

  6. Significant Assessment Shifts • Performance Tasks • Close Reading • Informational Text • Analytical Writing • Technology Enhanced Questions

  7. SBAC Evidence-Based Design • Identify what students should know and be able to do to demonstrate readiness for college and career: Four Claims

  8. Turn & Talk: Four Claims • Students can read closely and analytically to comprehend a range of increasingly complex literary and informational texts • Students can produce effective and well-grounded writing for a range of purposes and audiences • Students can employ effective speaking and listening skills for a range of purposes and audiences • Students can engage in research/inquiry to investigate topics, and to analyze, integrate, and present information

  9. SBAC Evidence-Based Design • Identify the kinds of evidence that would be sufficient to support the claims. These evidence statements are: Assessment Targets • Turn & Talk: What kinds of assessments do you typically use in your classroom?

  10. SBAC Item Types • Selected Response • Constructed Response • Performance Tasks • Technology Enhanced • Complex Thinking Skills • Analysis • Synthesis • Critical Thinking

  11. SBAC Item Types • Selected Response • 1 Claim • 1 Assessment Target • Multiple choice; may have multiple correct answers; 1-2 minutes per item • Constructed Response • 1 Claim • 1 Assessment Target • Short or long responses scored according to a rubric; 5-10 minutes per item

  12. SBAC Item Types • Performance Tasks • Multiple Claims • Multiple Targets

  13. SBAC Cognitive Rigor Foundation • What is cognitive rigor? • Write down your definition • Discuss with your table

  14. Now let’s apply your rigor definition… Your class has just read some version of Little Red Riding Hood. • What is a basic comprehension question you might ask? • What is a more rigorous question you might ask?

  15. Developing the Cognitive Rigor Matrix Different states/schools/teachers use different models to describe cognitive rigor. Each addresses something different. • Bloom: What type of thinking (verbs) is needed to complete a task? • Webb: How deeply do you have to understand the content to successfully interact with it? How complex is the content?

  16. Bloom’s Taxonomy [1956] & Bloom’s Cognitive Process Dimensions [2005]

  17. Webb’s Depth-of-Knowledge Levels (Webb's DOK)

  18. Webb’s Depth-of-Knowledge Levels • DOK-1: Recall & Reproduction - Recall a fact, term, principle, or concept, or perform a routine procedure • DOK-2: Basic Application of Skills/Concepts - Use information or conceptual knowledge, select appropriate procedures for a task, two or more steps with decision points along the way, routine problems, organize/display data, interpret/use simple graphs • DOK-3: Strategic Thinking - Requires reasoning and developing a plan or sequence of steps to approach the problem; requires some decision making and justification; abstract, complex, or non-routine; often more than one possible answer • DOK-4: Extended Thinking - An investigation or application to the real world; requires time to research, problem solve, and process multiple conditions of the problem or task; non-routine manipulations across disciplines/content areas/multiple sources

  19. Why Depth of Knowledge (DOK)? • A mechanism to ensure that the intent of the standard and the level of student demonstration required by that standard match the assessment items (required under NCLB) • To ensure that teachers are teaching to a level that will promote student achievement

  20. Same Verb—Three Different DOK Levels • DOK 1: Describe three characteristics of metamorphic rocks. (Requires simple recall) • DOK 2: Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences in the two rock types) • DOK 3: Describe a model that you might use to represent the relationships that exist within the rock cycle. (Requires deep understanding of the rock cycle and a determination of how best to represent it)

  21. Turn and Talk • Using the Cognitive Rigor Matrix, discuss your Little Red Riding Hood questions with your tablemates. • What DOK level would you assign to each of your questions and why? • How would you describe the differences between DOK 2 and DOK 3? • How would you describe the differences between DOK 3 and DOK 4?

  22. The CR Matrix: A Reading Example. Back to Little Red Riding Hood…

  23. Some general rules of thumb… • If there is only one correct answer, it is probably DOK 1 or DOK 2 • DOK 1: you either know it (can recall it, locate it, do it) or you don’t • DOK 2 (conceptual): apply one concept, then make a decision before going on to apply a second concept • If there is more than one solution/approach and evidence is required, it is DOK 3 or DOK 4 • DOK 3: must provide supporting evidence and reasoning (not just HOW it was solved, but WHY - explain the reasoning) • DOK 4: all of DOK 3 plus the use of multiple sources or texts

  24. Take-Away Message: Cognitive Rigor & Some Implications for Assessment • Assessing only at the highest DOK level will miss opportunities to learn what students do and don’t know – go for a range; end “high” in selected/prioritized content • Performance assessments can offer varying levels of DOK embedded in a larger, more complex task • Planned formative assessment strategies and tools can focus on differing DOK levels

  25. Turn & Talk: Reflecting on your own learning • Revisit your definition of rigor – has it changed/been refined? In what way? •  What is one way you might apply these ideas in your work? • What existing curriculum/assessment materials could you/your school examine for a range of cognitive rigor? • Classroom/instructional practices?

  26. SBAC Sample Items http://sampleitems.smarterbalanced.org

  27. SBAC Selected Response Example • Read the sentence from the text. Then answer the question:  “Nanodiamonds are stardust, created when ancient stars exploded long ago, disgorging their remaining elements into space.”   • Based on the context of the sentence, what is the most precise meaning of disgorging? A. scattering randomly B. throwing out quickly C. spreading out widely D. casting forth violently

  28. SBAC Constructed Response Example • In the space below, identify the sentences from the paragraph that are unnecessary, and briefly explain why each one should be removed.

  29. SBAC Performance Tasks: Structure of a Performance Task

  30. Performance Tasks • Primary Claims to be Measured • Writing: narrative, research, possibly reading • Writing: informational/explanatory, research, possibly reading • Writing: argumentative, research, possibly reading • Writing: opinions, research, possibly reading • Speaking, research, reading, listening

  31. Performance Tasks • Task Overview • Classroom Activity • Student Tasks: • Part 1: Read, research, and respond to research questions and possibly reading questions • Part 2: Respond to a writing or speech prompt • Task Specifications and Scoring Rubrics

  32. Performance Task Sample • Which claims does this task address? • What do you think the assessment targets are? • What do students need to be able to do to complete this task? • Which standards does this task address? • What vocabulary do students need to have in order to complete this task? • What is the Depth of Knowledge level required for this task? • What are the implications for your classroom?

  33. Performance Task Sample • Discuss the Task Specifications: • 11th on page 10 • 6th on page 8 • Did you agree with the claims, assessment targets, standards, and DOK level? • Peruse pages 12-15. How do these scoring rubrics compare to your classroom expectations? How might you change your classroom expectations to align with these rubrics and scoring information?

  34. “In two years from now, if you are teaching almost the same lessons that you have always taught, then you have not adopted the CCSS. These standards demand a new way of teaching and assessing.” (Tony Frontier) Standards Based, NOT Standards Referenced

  35. Teachers are the Key “Teachers must be the primary driving force behind change. They are best positioned to understand the problems that students face and to generate possible solutions.” James Stigler and James Hiebert, The Teaching Gap

  36. Quality Instruction Makes A Difference • “Good teaching can make a significant difference in student achievement, equal to one effect size (a standard deviation), which is also equivalent to the effect that demographic classifications can have on achievement.” • Paraphrase of Dr. Heather Hill, University of Michigan

  37. Research has indicated that... “teacher quality trumps virtually all other influences on student achievement.” (e.g., Darling-Hammond, 1999; Hamre and Pianta, 2005; Hanushek, Kain, O'Brien, and Rivkin, 2005; Wright, Horn, and Sanders, 1997)
