
DDI and Assessments in Mathematics: Analyzing and Tracking Data


Presentation Transcript


  1. DDI and Assessments in Mathematics: Analyzing and Tracking Data at the 6-12 Level May 13, 2014

  2. Session Objectives • Be able to describe what to look for when analyzing student work for a Common Core-aligned assessment • Be able to create a data tracker for assessments • Develop questions that drive data-analysis meetings around Common Core-aligned assessment data

  3. Agenda • Introduction • Warm Up: Review of Assessment Design • Looking at Student Work • Examples • Time with work you brought • Tracking Student Work • Leading a Data Meeting with Questions • Q & A

  4. Introduction • Part I: Assessment (February) • Part II: Analysis* (May) • Part III: Action (July)

  5. *Analysis is hard. • We don’t want to “granularize” content… …but we have to do something to look “under the hood” • We want our students engaged in rich tasks… …but we want to dig into the work associated with the tasks to learn specifics about what our students know and can do • We don’t want to put rigor in silos or to create a “checklist” for rigor… …but we want usable information about how our kids are doing with respect to the demands of the Common Core

  6. Warm Up: Review of Assessment Design 1. What makes this assessment Common Core-aligned? 2. Critique this. How could it be improved?

  7. Review: What Two Things Make a Great Common Core Assessment? 1. Balance of rigor 2. Variety of levels

  8. Think Aloud… • I knew some of my students were functioning below grade level, so I used the RP domain heading to locate similar understandings at the 6th grade level. This drove instruction for my unit and allowed for more differentiation. • I tried to include a variety of prompts/question types that would offer a balance of rigor. This drove instruction for my unit and ensured a balance of rigor throughout the unit.

  9. Examine Sample Assessment First Focus Question: “Imagine looking at some student work associated with this assessment. What kinds of errors do you think you’d see? What would these errors reveal about students?”

  10. Looking at Student Work Second Focus Question: “Look at the work from Veronica and Englebert. What kinds of errors do you see? What do these errors reveal about students?”

  11. This Evening’s Two Big Ideas: Analyze student work based on: 1. The grade level standard(s) being measured 2. The type of error, viewed through a rigor* lens

  12. *Rigor Means Different Things to Different People • Procedural • Conceptual • Application

  13. Veronica (student work samples, slides 13–18; images not included in transcript)

  19. Englebert (student work samples, slides 19–24; images not included in transcript)

  25. Summary Notice: 1. The grade level standard(s) being measured 2. The type of error, viewed through a rigor* lens

  26. Activity • Spend some time with student work that you brought. • What standards are being measured? • What types of errors are being made? • If you didn’t bring any student work, look at the annotated items.

  27. How Do We Track Data?

  28. Tracking The Class

  29. Each Item, Through Multiple Lenses

  30. Useful Disaggregation

  31. Useful Disaggregation

  32. Possible Modifications • Break down data to show strategies employed (e.g., table, equation) • Break down P, C, A further (e.g., “P – Division of fractions”) • Include other “lenses” (e.g., vocabulary, writing) • Also tag items at performance levels, using PLDs • Tag items to more than one standard
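The modifications above all amount to the same move: tag each item with several “lenses” (standard, rigor type, etc.) and aggregate student results by any one of them. As a minimal sketch of such a tracker (the item tags and scores below are hypothetical; the student names echo the deck’s examples), this might look like:

```python
# Minimal item-level data tracker sketch (hypothetical data).
# Each item is tagged with a grade-level standard and a rigor type:
# "P" (procedural), "C" (conceptual), or "A" (application).
from collections import defaultdict

ITEMS = {  # item number -> (standard, rigor type) -- illustrative tags only
    1: ("6.RP.A.1", "P"),
    2: ("6.RP.A.2", "C"),
    3: ("6.RP.A.3", "A"),
}

# student -> {item: 1 if correct, 0 if not} -- made-up scores
SCORES = {
    "Veronica":  {1: 1, 2: 0, 3: 0},
    "Englebert": {1: 1, 2: 1, 3: 0},
}

def percent_correct(lens):
    """Aggregate percent correct by standard (lens=0) or rigor type (lens=1)."""
    totals = defaultdict(lambda: [0, 0])  # tag -> [correct, attempted]
    for answers in SCORES.values():
        for item, score in answers.items():
            tag = ITEMS[item][lens]
            totals[tag][0] += score
            totals[tag][1] += 1
    return {tag: round(100 * c / n) for tag, (c, n) in totals.items()}

by_standard = percent_correct(0)  # e.g. {"6.RP.A.1": 100, ...}
by_rigor = percent_correct(1)     # e.g. {"P": 100, "C": 50, "A": 0}
```

Tagging items to more than one standard, or to a PLD level, would just mean adding columns to `ITEMS` and passing a different lens index.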

  33. Using Questions to Lead Data Meetings “Bambrick Model,” based on Paul Bambrick-Santoyo’s Driven By Data We’ll look at: • “Pre-Cursors” (what happens before a data meeting) • “Conversation Starters and Re-Directors” (what happens during a data meeting) $64,000 Question: How might these look different using a Common Core-aligned assessment?

  34. “Pre-Cursors”

  35. “Pre-Cursors” • How would we prepare differently for a Common Core assessment meeting? • What different activities would we ask teachers to do? • What different questions would we pose?

  36. “Conversation Starters & Re-Directors”

  37. “Conversation Starters & Re-Directors” • What would be different during a Common Core assessment meeting? • What different activities would we ask teachers to do? • What different questions would we pose?

  38. This Evening’s Two Big Ideas, Revisited: Analyze student work based on: 1. The grade level standard(s) being measured 2. The type of error, viewed through a rigor lens

  39. Session Objectives • Be able to describe what to look for when analyzing student work for a Common Core-aligned assessment • Be able to create a data tracker for assessments • Develop questions that drive data-analysis meetings around Common Core-aligned assessment data

  40. Thanks! • Q & A
