
Connecting Teaching and Learning Through Assessment

Connecting Teaching and Learning Through Assessment. Advanced Data Training III. Today’s Content: Responding Adaptively to Data Analysis. Use prediction, problem identification, problem clarification, hypothesis testing and follow-through from an instructional perspective.


Presentation Transcript


  1. Connecting Teaching and Learning Through Assessment Advanced Data Training III

  2. Today’s Content: Responding Adaptively to Data Analysis • Use prediction, problem identification, problem clarification, hypothesis testing and follow-through from an instructional perspective. • Connections to align with CCSS instruction and assessment will be integrated into this session.

  3. Webinar Format Expect interaction! Guided discussion and cross-site dialogue are planned at critical points in the session.

  4. Review Blog Homework • Summarize responses from blog on 3 questions • What fixes have you used that have resulted in unintended consequences? • What structures may be limiting the success of your strategies? • Which less desired actions are getting reinforced at the expense of more desired actions?

  5. Tools for Responding Adaptively to Your Classroom Data • Prediction • Problem identification • Problem clarification • Solution/Strategy identification • Monitoring implementation of Solution/Strategy

  6. First Steps to Adaptive Response to Data • Prediction: What do I think the data will say about my students’ learning? • What content and skills did I test? • How, when and for how long did we engage with that content and those skills? • What other factors may have impacted students’ responses? Attendance, holidays, etc. • How well did students engage in the learning? Number of missed assignments, degree of scaffolding &/or intervention provided during the unit, etc. Modified from TERC. (2011). Using Data: A TERC Initiative. Retrieved from http://usingdata.terc.edu/data_tips/index.cfm

  7. Next Steps: Problem Identification • Problem Identification: What do the data say? Compare your prediction to what the data actually say about your students’ learning. • What can I say about how my students performed on this assessment? • What is direct and observable? What patterns or trends are evident? • I notice that… • I see that… • I wasn’t expecting that… • Avoid trying to make meaning at this stage!

  8. Next Steps: Problem Identification (continued) • Problem Identification (continued): What do I think this means? • What can I infer about my students’ learning given these data? • “I think our students may have difficulty applying what they know when they are asked to complete multiple steps to solve a problem.” • “I think this problem may be limited to a subset of our students, not all of them.” • “These results may mean we had success with most of our students by engaging them in group problem solving activities, but our struggling students didn’t make the necessary connections.”

  9. Next Steps: Problem Identification (continued) • Determine what you think your students need next given your inferences about the data. • Before you write a problem statement. • Do I have a clear sense of the problem(s)? • If yes, write your problem statement. • If no, … • Do I need to clarify the problem(s)?

  10. Considerations in clarifying the problem • What if there weren’t enough items to support my inferences? (reliability and measurement error) • What if the level of detail of the assessment doesn’t support instructional decisions? (validity) • What if Bobby was just having a bad day? (point in time vs. multiple measures) • What if there weren’t enough students in the group to support an inference based on the scores? (generalizability, reliability/volatility of the scores due to N size)
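The N-size concern above can be made concrete with a quick sketch. Assuming a normal-approximation standard error for an observed proportion (our illustration, not part of the session materials), the plausible range around a group's proficiency rate widens sharply as the group shrinks:

```python
import math

def standard_error(p: float, n: int) -> float:
    """Normal-approximation standard error of an observed proportion p
    for a group of n students."""
    return math.sqrt(p * (1 - p) / n)

# The same observed 70% proficiency rate, for groups of different sizes.
for n in (5, 25, 100):
    se = standard_error(0.70, n)
    low = max(0.0, 0.70 - 2 * se)   # rough 95% band, clamped to [0, 1]
    high = min(1.0, 0.70 + 2 * se)
    print(f"n={n:>3}: roughly {low:.0%} to {high:.0%}")
```

With five students, an observed 70% rate is consistent with anything from about 29% to 100%; with 100 students the band narrows to roughly 61% to 79%. This is why a small-group score can swing wildly from one assessment to the next without any real change in learning.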

  11. Reflection/Interaction Activity Have you ever jumped from inference to action before you really understood the problem? Think back to the last time you jumped from inference to action without clarifying the problem. Share examples and any unintended consequences.

  12. Next Steps: Problem Clarification • Problem Clarification: Why do I think my students performed as they did? • What might explain the results? Analyze the tasks & use your earlier predictions to jump start your thinking. • “I wonder if students understood the question? Did they know what they were asked to do? Could they start the solution but not finish it? Where might they have gotten lost in the problem statement?” • “Did my students have the vocabulary/content/skills to answer the question? Were there Tier 2 or Tier 3 words that were obstacles to comprehending the question?” • “I question whether my highest performing students were just bored and didn’t bother to respond completely.”

  13. Next Steps: Problem Clarification (continued) • What additional information would help me clarify the problem or determine why these results occurred? • “I wonder how these students would do if I presented the individual elements of the problem separately?” • “Which words might some of my students have stumbled over? Were there any words that were essential to understanding the problem or task?”

  14. Use formative assessment and less formal assessments to clarify a problem Following up with less formal assessments may clarify the specific content or subskills with which students are struggling, or conversely, confirm that students already have a handle on most of the content/skills. For example, work samples…

  15. Reflection/Interaction Activity Discuss some of the less formal assessment steps you can take to clarify a problem identified in your data analysis. Share across sites.

  16. Using a hypothesis to clarify a problem and test a solution Developing a hypothesis may help you clarify a problem, while considering a possible solution or strategy to address the problem.

  17. Next Steps: Solution/Strategy Identification • What do I do next because of what I learned about my students’ needs from problem clarification? • Hypothesizing, or constructing a theory of action, bridges the gap between data analysis and data use! • From the simple, “If students are struggling to comprehend the text, then specific instructional tasks building Tier 2 vocabulary embedded in the text will improve their comprehension.” • To the more complex, “If students complete tasks that include comparing and contrasting information from primary and secondary sources, then students will develop more complex writing skills.”

  18. How does pattern analysis apply to instructional/assessment planning?

  19. It isn’t Just about Bloom’s: The complexity of the texts students read was just as important as what students were asked to do with what was read! Distinguishing between students who are college- and career-ready and those who are not… ACT Reading Between the Lines (2006)

  20. CCSS Connection

  21. Novices who have far to go • Demonstrate basic procedural skill and conceptual understanding. • Thus, review of work samples or other informal assessments of these students may reveal where the disconnections are in their basic procedural skills or conceptual understanding. • Interventions for these students target specific deficits in basic procedural skills and conceptual understanding.

  22. Novices who have far to go • Grade-level instruction for these students includes scaffolds to ensure they engage with grade-level content and tasks in a meaningful manner without sacrificing grade-level complexity. • Questions for these students follow the reading of the text and redirect them to re-read difficult sections, providing specifics that guide them to identify key phrases, key statements, specific organizational elements, etc. • The key is to provide explicit instruction without oversimplifying (dumbing down) grade-level materials and tasks.

  23. Reflection/Interaction Activity How have your teams managed the process of meeting grade-level instruction and intervention needs? Share successful strategies from sites.

  24. Apprentices that Nearly Meet or Meet Standards • Review of formal or informal assessments indicates these students • Are approaching or are meeting grade-level expectations in terms of procedural skills and conceptual understanding. • May regularly meet the standard when reading or writing simple text. • Are able to use these skills and understandings in more complex contexts with scaffolds. • May need scaffolding to build these skills toward generalizability in more complex text and tasks.

  25. Apprentices that Nearly Meet or Meet Standards • CCSS Example (Figure 18, Appendix A) • Language Progressive Skills

  26. Scaffolds help bridge the complexity gap Goal: scaffold toward independence, from L.5.1d (Recognize and correct inappropriate shifts in verb tense) to L.8.1d (Recognize and correct inappropriate shifts in verb voice and mood).

  27. Experts: students exceeding standards • Can use their skills and understanding to navigate or communicate wonderfully complex chains of inference. • Still likely to need scaffolding to master higher levels of text complexity (CCSS, Appendix A ELA).

  28. Experts: students exceeding standards • Challenge ≠ More of the Same or Extra Work • Extend and enrich with more complex contexts and connections that mimic messy real-world scenarios • Scaffold where needed to ensure access to more complex contexts • Move toward increasing independence

  29. Next Steps: Solution/Strategy Identification (continued) • The solutions or strategies you select should • Be grounded in evidence-based, research-based &/or promising practices. • Make sense within your hypothesis &/or theory of action. • In other words, your solutions &/or strategies have a high likelihood of evoking the results you seek!

  30. Final Steps: Monitor Implementation of Solution/Strategy • How will you know you’ve been successful, or whether you need to redirect your efforts or your students’ efforts? • The answer is problem-specific, based on whether you are remediating, intervening, advancing, enriching or extending students’ learning. • Percentage of instructional activities for enriching/extending learning for students demonstrating expertise. • Percentage of instructional time spent in intervention for a select group of students.

  31. Final Steps: Monitor Implementation of Solution/Strategy • Monitoring implementation is problem-specific based on whether you are remediating, intervening, advancing, enriching or extending students’ learning. • Percentage of lessons reflecting instruction at cognitive rigor or complexity identified as needed by data analysis. • Percentage of assessments reflecting cognitive rigor or complexity aligned with instruction. • Percentage of intervention time spent explicitly modeling cognitive thinking for students.
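The monitoring indicators on these two slides are simple ratios over a log of observations. A minimal sketch of tallying one of them (the names and the sample log are ours, purely illustrative):

```python
def percent_meeting(flags: list[bool]) -> float:
    """Percentage of logged items (lessons, assessments, intervention blocks)
    that met the monitoring criterion."""
    if not flags:
        raise ValueError("no observations logged yet")
    return 100.0 * sum(flags) / len(flags)

# Hypothetical log: did each of 10 lessons reach the targeted cognitive rigor?
lesson_log = [True, True, False, True, True, False, True, True, True, False]
print(f"{percent_meeting(lesson_log):.0f}% of lessons at the targeted rigor")
# -> 70% of lessons at the targeted rigor
```

The same tally works for any of the listed indicators; what changes is only what each logged observation represents and how often the log is reviewed.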

  32. Reflection/Interaction Activity • What indicators are you using to monitor whether your strategies or solutions are successful? • How are you monitoring? • What data are you collecting? • How often? • What structures are you using to facilitate monitoring? • How are you using the data you collect on student engagement? • Are these data integrated into your data team analyses?

  33. What We Learned • Actions that bridge the gaps in data analysis and data use. • Prediction • Problem identification • Problem clarification • Solution/Strategy Identification • Monitoring Implementation of Solution/Strategy
