1st Annual Data Summit: Comprehensive Assessment Systems & Formative Assessment

Presentation Transcript

  1. 1st Annual Data Summit: Comprehensive Assessment Systems & Formative Assessment. David Abrams, Sullivan County BOCES, 8/20/2013

  2. Introduction • Validity is a process not a product. • Design assessments to function as a lever for good instructional practices. • Consciously design assessments so that the data produced can be used to: inform instruction; provide evidence of student achievement and/or growth; and facilitate the transition to the Common Core College & Career Ready Standards. • Document the Process.

  3. Assessment System Design: Key Questions/Bright Lines • What do I want to learn from this assessment? • Who will use the information gathered from this assessment? • What action steps will be taken as a result of this assessment? • What professional development or support structures should be in place to ensure the action steps are taken appropriately? • How will student learning improve as a result of using this assessment, and will it improve more than if the assessment were not used? (Perie, Marion, & Gong, 2009)

  4. Comprehensive Assessment System: Components Summative: given one time at the end of the semester or school year to evaluate students’ performance against a defined set of content standards. Can be used for accountability or to inform policy, and/or can be teacher administered for grading purposes (Perie, Marion, & Gong, 2009).

  5. Comprehensive Assessment System: Components Interim: Assessments administered during instruction to evaluate students’ knowledge and skills relative to a specific set of academic goals in order to inform policymaker or educator decisions at the classroom, school, or district level. The specific interim assessment designs are driven by the purposes and intended uses, but the results of any interim assessment must be reported in a manner allowing aggregation across students, occasions, or concepts (Perie, Marion, & Gong, 2009).

  6. Comprehensive Assessment System: Components Interim (cont’d): Key components of interim assessments are: 1) they evaluate students’ knowledge and skills relative to a specific set of academic goals; and 2) they are designed to inform decisions both at and beyond the classroom level (Perie, Marion, & Gong, 2009).

  7. Comprehensive Assessment System: Components Formative: used by teachers to diagnose where students are in their learning, where gaps in knowledge and understanding exist, and to help teachers and students improve student learning. The assessment is embedded within the learning activity and linked directly to the current unit of instruction.

  8. Comprehensive Assessment System: Components Formative assessments are used most frequently and have the smallest scope and shortest cycle, while summative assessments are administered least frequently and have the largest scope and longest cycle. Interim assessments fall between the two (Perie, Marion, & Gong, 2009).

  9. Formative Assessment: Table Discussion-Information Processing How is Formative Assessment being used in your District, Schools, and/or Classroom? What constitutes a “quality” formative assessment, and how do you know? What are your goals for implementing Formative Assessment in your school? Does your district have Professional Learning Communities/Whole Faculty Study Groups, vertical & horizontal, in place? Are you comfortable using data to inform instruction?

  10. Formative Assessment: Perie, Marion, Gong • Formative assessment is used by classroom teachers to diagnose where students are in their learning, where gaps in knowledge and understanding exist, and how to help teachers and students improve student learning. • The assessment is embedded within the learning activity and linked directly to the current unit of instruction. • The tasks presented may vary from one student to another depending on the teacher’s judgment.

  11. Formative Assessment: Perie, Marion, Gong • Providing corrective feedback, modifying instruction to improve the student’s understanding, or indicating areas of further instruction are essential aspects of a classroom formative assessment.

  12. Formative Assessment • …true meaning of formative assessment: an activity designed to give meaningful feedback to students and teachers and to improve professional practice and student achievement (Reeves, 2009). • Three things must occur for an assessment to be formative: the assessment is used to identify students who are experiencing difficulty; those students are provided additional time and support to acquire the intended skill or concept; and the students are given another opportunity to demonstrate what they have learned (DuFour, Eaker, and Karhanek, 2010).

  13. Formative Assessment • Formative assessment is a systematic process to continuously gather evidence and provide feedback about learning while instruction is underway. • The feedback identifies the gap between a student’s current level of learning and a desired learning goal. • Teachers elicit evidence about student learning using a variety of methods and strategies, e.g. observation, questioning, dialogue, demonstration, and written response. (Heritage, et al 2009)

  14. Formative Assessment • Teachers must examine the evidence from the perspective of what it shows about student conceptions, misconceptions, skills, and knowledge. • They need to infer the gap between students’ current learning and desired instructional goals, identifying students’ emerging understanding or skills so they can modify instruction. • For assessment to be formative, action must be taken to close the gap based on the evidence elicited. (Heritage, et al 2009)

  15. Formative Assessment What We Know: • It is not a kind of test. • Formative Assessment practice, when implemented effectively, can have powerful effects on learning. • Formative Assessment involves teachers making adjustments to their instruction based on evidence collected, and providing students with feedback that advances learning. • Students participate in the practice of formative assessment through self- and peer-assessment. (Heritage, 2011)

  16. Formative Assessment: Teacher’s Role • Effective when teachers are clear about the intended learning goals for a lesson: this means focusing on what students will learn, as opposed to what they will do. • Teachers need to share the learning goal with students. • Teachers need to communicate the indicators of progress toward the learning goal. • There is no single way to collect formative evidence because formative assessment is not a specific kind of test. (Heritage, 2011)

  17. Formative Assessment: Student’s Role • Students’ role begins when they have a clear conception of the learning target. • In self-assessment, students engage in metacognitive activity, thinking about their own learning while they are learning. • They generate feedback that allows them to make adjustments to their learning strategies.

  18. Formative Assessment: Student’s Role • It is important to include peer-assessment, where students give feedback to their classmates. • It is important that students both reflect on their learning and use the feedback to advance learning. (Heritage, 2011)

  19. Formative Assessment: Evidence Collection • Whatever methods teachers use to elicit evidence of learning, it should yield information that is actionable by them and their students. • Evidence collection is a systematic process and needs to be planned so that teachers have a constant stream of information tied to indicators of progress. (Heritage, 2011)

  20. Formative Assessment: Feedback • Feedback obtained from planned or spontaneous evidence is an essential resource for teachers to shape new learning through adjustments in their instruction. • Feedback that the teacher provides to students is also an essential resource so that students can take active steps to advance their own learning. (Heritage, 2011)

  21. Common Formative Assessment • Common assessment refers to those assessments given by teacher teams who teach the same content or grade level; those with “collective responsibility for the learning of a group of students who are expected to acquire the same knowledge and skills.” • No teacher can opt out of the process: common assessments use the same instrument, or a common process that applies the same criteria for determining the quality of student work. (DuFour et al., 2010)

  22. Common Formative Assessment: Benefits • Promote efficiency for teachers • Promote equity for students • Provide an effective strategy for determining whether the guaranteed curriculum is being taught and learned • Inform the practice of individual teachers • Build a team’s capacity to improve its program • Facilitate a systematic, collective response to students who are experiencing difficulty • Serve as a tool for changing adult behavior and practice (Bailey & Jakicic, 2012)

  23. Formative Assessment: Table Discussion-Information Processing Do these views of Formative Assessment converge with yours? Why/Why Not? What strikes you as most important given the key points regarding Formative Assessment? How do you see implementing assessment strategies that effectively utilize the crucial aspects of Formative Assessment? Are you comfortable designing Common Formative Assessments?

  24. Formative Assessment: Evidence to Action-G-Study • Heritage et al. determined that there is little research to evaluate teachers’ ability to adapt instruction based on assessment of student knowledge and understanding. • Research in mathematics has shown that moving from evidence to action may not always be the seamless process formative assessment demands. • G-study results provide data showing teachers do better at drawing inferences about students’ levels of understanding from assessment evidence, while having difficulty deciding next instructional steps.

  25. Formative Assessment: Evidence to Action-G-Study • Heritage et al. conducted a G-study using 3 mathematical concepts as the instructional learning goal. • The teachers’ pedagogical knowledge in mathematics was the object of measurement. The study was designed to provide information about potential sources of variation in measuring teachers’ pedagogical knowledge in mathematics. • The study design included three sources of score variability: rater, mathematics principle, and type of task.

  26. Formative Assessment: Evidence to Action-G-Study • Rater: The study used performance tasks; one source of variance was the possibility of score variation between raters due to interpretation and application of the rubric and how stringent or lenient a rater may be. • Principle: different types of domain-specific principles may cause variance due to a given teacher’s preparation; a teacher may have more knowledge about one principle than others. (The study evaluated 3 principles: distributive property, solving equations, & rational number equivalence.)

  27. Formative Assessment: Evidence to Action-G-Study Task: potential source of variability. Study focused on 3 types of tasks: identifying key principles; evaluating student understanding; and planning the next instructional step based on the evaluation of student understanding.

  28. Formative Assessment: Evidence to Action-G-Study Results • The main effects of principle and rater were minimal. Teachers knew the concept and knew how to evaluate the student work to determine learning. • Important finding: regardless of the math principle, determining next instructional steps based on the examination of student responses tends to be more difficult for teachers. • If teachers are not clear about what the next steps are to move learning forward, then the promise of Formative Assessment to improve student learning is impacted negatively.

  29. Formative Assessment: Evidence to Action-G-Study Results & Teacher Support • Teachers need clear conceptions of how learning progresses in a domain; they need to know what the precursor skills and understandings are for a specific instructional goal, what a good performance of the desired goal looks like, and how the skill increases in sophistication from the current level students have reached. • Learning progressions describe how concepts and skills increase in sophistication in a domain from the most basic to the highest level, showing the trajectory of learning along which students are expected to progress.

  30. Formative Assessment: Evidence to Action-G-Study Results & Teacher Support • From a learning progression, teachers can access the big picture of what students need to learn and grasp what the key building blocks of the domain are, while having sufficient detail for planning instruction to meet short-term goals. • Teachers are able to connect Formative Assessment opportunities to short-term goals as a means to track student learning. • Learning progressions alone will not be sufficient.

  31. Formative Assessment: Evidence to Action-G-Study Results & Teacher Support • Teachers need to know what a good performance of their specific short-term learning goal looks like. They must also know what good performance does not look like. • Key finding: using assessment information to plan subsequent instruction tends to be the most difficult task for teachers as compared to other tasks. This finding gives rise to the question: can teachers always use formative evidence to effectively “form” action?

  32. Formative Assessment: Instructional Support-Marzano, Pickering, and Pollock (2001) 9 highly effective, research-based strategies • Identifying similarities and differences • Summarizing and note-taking • Reinforcing effort and providing recognition • Homework and practice • Nonlinguistic representations • Cooperative learning • Setting objectives and providing feedback • Generating and testing hypotheses • Cues, questions, and advance organizers

  33. Formative Assessment: Table Discussion-Information Processing How can you use Professional Learning Communities to support teachers and students when implementing Formative Assessment? Based on Heritage et al.’s findings, what type of professional development will you need to support Formative Assessment in your District?

  34. Formative Assessment: Validity Framework Review Nichols, Meyers, & Burling (2009) Validity Framework for a Formative System: • Figure 1: General Framework for Evaluating Validity • Figure 2: Framework using the identification of specific procedural errors • Figure 3: Framework for a system of reteaching Validity: The degree to which accumulated evidence and theory support specific interpretations of test scores entailed by the proposed use of a test (Joint Standards glossary).

  35. Formative Assessment: Design • Need to unwrap standards or evaluate State Testing Data to determine demonstrated areas of instructional need • Define & Align to State and District Expectations Review Bailey & Jakicic, Figure 4.1: 5 Steps • Focus on Key Words • Map it Out • Analyze the target • Determine big ideas • Establish Guiding Questions for Instruction

  36. Formative Assessment: Design • Assessments must provide information about important learning targets that are clear to students and teacher teams. • Assessments provide timely information for both students and teacher teams. • Assessment must provide information that tells students and teacher teams what to do next.

  37. Formative Assessment: Design • Determine Assessment types: selected response; constructed response; performance task • Determine number and balance of items • Select/design assessment • Administer, evaluate responses, and redesign instructional strategies

  38. Formative Assessment: Design • Assess again (in the original assessment design, create enough items to have more than one assessment) • Evaluate depth and breadth of rigor: Webb’s Depth of Knowledge/Bloom’s Taxonomy • Utilize PLCs to support process and data discussions

  39. Cognitive Response Demands: Reading Load Reading Load: The amount and complexity of the textual and visual information provided with an item that an examinee must process and understand in order to respond successfully to an item (Ferrara 2011).

  40. Cognitive Response Demands: Reading Load Low Reading Load: May include a small amount of text. Moderate Reading Load: May include less text, fewer visuals, and less complex text than a high-load item. High Reading Load: May include a large amount of text, much of which is complex linguistically or complex because of the content-area concepts and terminology used (Ferrara, 2011).

  41. Cognitive Response Demands: Mathematical Complexity Ranges for NAEP Low Complexity: Items may require examinees to recall a property or recognize a concept; these are straightforward, single-operation items. Moderate Complexity: May require examinees to make connections between multiple concepts, perform multiple operations, and display flexibility in thinking as they decide how to approach a problem. High Complexity: May require examinees to analyze assumptions made in a mathematical model or to use reasoning, planning, judgment, and creative thought; high-complexity items assume that students are familiar with the mathematics concepts and skills required by an item (Ferrara, 2011).

  42. Common Core Transition: Academic Language Academic language is used to refer to the form of language expected in contexts such as the exposition of topics in the school curriculum, making arguments, defending propositions, and synthesizing information (Snow, 2010). (See Coxhead’s High-Incidence Academic Word List)

  43. Bailey & Jakicic Sample Assessment Plan

  44. Revised Item Map for Locally Developed Assessments-Formative Assessments Sample Item Map Template: Prior to Assessment Administration Sample Item Map Template: Post Assessment Administration *AL: Academic Language

  45. Multiple Measures-Joint Standards 13.7: In educational settings, a decision or characterization that will have major impact on a student should not be made on the basis of a single test score. Other relevant information should be taken into account if it will enhance the overall validity of the decision.

  46. Multiple Measures Multiple measures are intended to improve the quality of high-stakes decision making so that decisions are not based on a single measure. The definition of multiple measures, the criteria to evaluate each measure, and how these measures should be combined for use in decision making are not clear. (Henderson-Montero, Julian, & Yen, 2003)

  47. Multiple Measures: Examples • Test more than one content area • Assess a content area using a combination of multiple-choice (MC) and constructed-response (CRQ) formats • Assess a content area using an on-demand test and a class-based portfolio (writing) • Assess school performance using a combination of academic tests and other indicators • Make progressively “higher stakes” decisions about schools using a combination of accountability scores and other reviews • Use for promotion/graduation processes by meeting certain criteria, even if a student does not pass a State’s on-demand test (Gong & Hill, 2001).

  48. Multiple Measures: Examples • Have several assessment instruments that can be used by students with various proficiency levels or presentation/response needs • Allow students multiple opportunities to retake the test to determine whether they meet minimum cut scores • Allow for promotion/graduation • Double-score every constructed-response item on tests used for high school student accountability • Assess school performance using an average of at least two years of data • Assess school performance using as many grades of students as practical. (Gong & Hill, 2001)

  49. Multiple Measures • Need to create a framework for combining multiple measures. Four Categories of Rules: • Conjunctive • Compensatory • Mixed conjunctive-compensatory • Confirmatory (Henderson-Montero, Julian, and Yen, 2003; Chester, 2003).

  50. Multiple Measures • Conjunctive: requires attainment of a minimum standard on each measure (e.g., meeting a designated cut score on a specific exam). • Compensatory: weaker performance on one measure can be offset by stronger performance on another. Performance on two or more measures is required before they can be combined under a compensatory rule. • Mixed conjunctive-compensatory: uses a combination of conjunctive and compensatory approaches; e.g., a minimal performance level is required on each measure, but beyond that minimal level, poorer performance on one measure can be counterbalanced by better performance on other measures.
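The first three combination rules above can be sketched as simple decision functions. The following is a minimal illustration only, not drawn from the presentation: the measure names, cut scores, floors, and composite total are all hypothetical placeholders.

```python
# Illustrative sketch of three multiple-measures combination rules
# (Henderson-Montero, Julian, & Yen, 2003). All measure names and
# numeric cut scores below are hypothetical, for demonstration only.

CUT_SCORES = {"state_exam": 65, "portfolio": 70}   # conjunctive: per-measure cuts
MINIMUMS = {"state_exam": 55, "portfolio": 55}     # mixed rule: per-measure floors
COMPENSATORY_CUT = 135                             # compensatory: combined total

def conjunctive(scores):
    """Pass only if every measure meets its own cut score."""
    return all(scores[m] >= cut for m, cut in CUT_SCORES.items())

def compensatory(scores):
    """Pass if the combined total meets the cut; strength on one
    measure can offset weakness on another."""
    return sum(scores.values()) >= COMPENSATORY_CUT

def mixed(scores):
    """Require a minimal floor on each measure, then apply the
    compensatory rule above the floors."""
    meets_floors = all(scores[m] >= floor for m, floor in MINIMUMS.items())
    return meets_floors and compensatory(scores)

student = {"state_exam": 60, "portfolio": 80}
print(conjunctive(student))   # False: state_exam (60) is below its cut (65)
print(compensatory(student))  # True: 60 + 80 = 140 >= 135
print(mixed(student))         # True: both floors (55) met, and the total passes
```

The sketch makes the policy trade-off concrete: the same student fails the conjunctive rule but passes the compensatory and mixed rules, which is exactly the kind of consequence a district framework for combining measures would need to weigh.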