
Formative Assessment Year 3 and Beyond TAP Schools



Presentation Transcript


  1. Formative Assessment Year 3 and Beyond TAP Schools December 11, 2009

  2. Welcome • Welcome and Overview • Sheila Talamo, TAP Director • The World According to Mr. Rogers by Fred Rogers

  3. Objectives • Deepen understanding of the role of the Leadership Team in supporting and monitoring the analysis of student work in the field tests and in cluster. • Deepen understanding of the process of establishing criteria, developing measurement tools, and creating formative assessments that are based on the criteria and aligned to the testing format. • Develop a working knowledge of using formative assessments to track cluster progress and pinpoint specific student difficulties.

  4. Agenda • Welcome • Formative Assessment Review • Backwards Design Model (I DO) • Refine Criteria (We Do) • Schools Establish Criteria (You Do) • When Students Track Their Progress • Tracking Data • Development • Book Talks • Evaluation/Closing

  5. Questions for my EMT Due to time constraints, please: • Jot down the questions you have as we go through the training • You might discover the answer to your question later in the day • EMTs will give individualized support/consultation on your questions Handout

  6. Let’s Review • Effective Use of Formative Assessments: Classroom vs. Cluster • Planning for Assessment of Student Learning • Leadership Team’s Role Regarding Formative Assessments

  7. Classroom/Cluster Connection (diagram linking Classroom and Cluster)

  8. What do you consider when planning your assessment of student learning? • AllWrite RoundRobin • Stand-N-Share

  9. When planning assessment of student learning I consider… • Criteria • Skills • Alignment • Formative Measures • Summative Measures • Student Characteristics • Tracking & Utilizing Student Data

  10. What is the Leadership Team’s Role in Establishing Criteria and Tracking Formative Assessment? RoundTable

  11. Leadership Team’s Role: Establishing Criteria and Scoring Guides • Field testing information is continuously brought to the Leadership Team before it is presented at cluster. • The Leadership Team monitors and supports the development of criteria and formative assessments.

  12. Leadership Team’s Role: Tracking Formative Assessments • The Leadership Team monitors and supports the analysis and tracking of student work • After the pre-test is administered and analyzed, the Leadership Team supports the development of the Student IGP Goal

  13. Leadership Team’s Role: Tracking Formative Assessments • Based upon the progress data, the Leadership Team may need to assist the master and mentor teachers in refining the criteria and/or detecting sub-skill problems • The Leadership Team is the Master & Mentor teachers’ cluster

  14. What is the purpose of setting a Student IGP Goal? RoundRobin

  15. Setting the IGP Student Goal • The cluster cycle goal mirrors the Master/Mentor’s IGP goal. • The Master/Mentor’s IGP guides the cluster long-range plan

  16. Setting the IGP Student Goal • The origin of the Student IGP Goal is the field test pre-test data • Progress toward the IGP goal helps determine when to post-test • The Leadership Team decides when to administer the field test post-test that ends the cycle Handout

  17. Backwards Design Model (I Do) • Identify Need/Skill… -What is the defined need? -Use various data sources (LEAP, iLEAP, GEE, Benchmark Tests, District Assessments, DIBELS, etc.) -What GLE(s) will we target? -GLE(s) appropriate for content area based on biggest needs -What sub-skills/prerequisite skills are critical in order to address the targeted GLE(s)? Handout

  18. Backwards Design Model (I Do) • Select Strategy -What strategy will assist students in mastering the targeted GLE(s)/identified need? -Research & bring possible strategies to the Leadership Team for consideration. • Establish Criteria/Develop Scoring Guide -What does mastery of the targeted GLE(s)/identified need look like? -Test the students’ knowledge and application of the skill, not the strategy. -Language within the criteria will change based on content.

  19. Mix-Pair-Share / Rally Robin: What are some things you should consider when creating a scoring guide?

  20. Questions to Consider When Creating/Evaluating a Scoring Guide • Does the scoring guide relate to the outcome being measured? • Is the scoring guide free from irrelevant skills/sub-skills and information? • Are the descriptors and scoring levels well defined? • Do the criteria align with high-stakes testing? • Can teachers and students understand and consistently apply the scoring guide? • Is the scoring guide developmentally appropriate? • Does it reflect teachable skills or the strategy? • Will it provide the kind of information you need and can use effectively? Handout

  21. Backwards Design Model (I Do) • Develop Pre-Test -How will the pre-test be aligned to the criteria and the testing format? -Set parameters for administering the pre-test • Determine Field Test Group • Conduct Pre-Test

  22. Backwards Design Model (I Do) • Use Pre-Test Data to Refine Criteria -After examining the pre-test results, are there any grey areas within the scoring guide that must be clarified and resolved? -It is okay to change/adjust if needed • Set a Student IGP Goal -Based on the field test pre-test results, what will the Student IGP Goal be? -Set a realistic goal where all students must show growth -This goal tells you when to end the cycle

  23. Backwards Design Model (I Do) • Chunk/Segment Strategy -As you field test this may need to be adjusted or modified • Develop Formative Assessments for each chunk, aligned to criteria and sometimes aligned to testing format *see handout • Field Test Each Chunk Handout

  24. Break

  25. VIDEO Alice Harte: Master teachers creating the criteria

  26. RoundRobin • Why is it important to use student work samples when refining the criteria/scoring guide? • Why is it important to include the whole leadership team in this process?

  27. Establishing Criteria (We Do) • 3rd Grade Work Samples • Tessie Bell, Alice Harte Master Teacher • 3CR Strategy

  28. Establishing Criteria (We Do) • Number Off • One Stray

  29. Establishing Criteria (We Do) Using Tessie Bell’s 3rd Grade student work samples from Alice Harte: • We will refine the Progressing to Criteria section under the Restatement portion of the generic scoring guide together. • Our mission is to define the scoring levels to eliminate any grey areas teachers may encounter when scoring student work. Handout

  30. Lunch

  31. Establishing Criteria (You Do) Using Tessie Bell’s 3rd Grade student work samples from Alice Harte: • Table groups will refine the Progressing to Criteria section under the Details portion of the criteria within the scoring guide • Refer to the GLEs and guiding questions. • Sort the samples into HML (High/Medium/Low) according to characteristics, and use them to refine the scoring guide. Handout

  32. What criteria did your table refine?

  33. When Students Track Their Progress by Robert Marzano • Providing teachers with graphic displays of students’ scores on formative assessments was associated with a 26 percentile point gain • When students track their own progress using graphic displays, the gains are even higher Handout

  34. Fourteen Studies… Two different settings, same content taught for the same length of time: • Students tracked their progress • Students did not track their progress On average, students tracking their own progress was associated with a 32 percentile point gain in achievement.

  35. What information is provided for students? • First, the rubric provides a description of the levels of performance • Second, tracking progress provides a representation of each student’s progression of learning

  36. Best results obtained when: • Addressing a single goal in all the assessments • Using rubrics instead of points • Using different types of assessment
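
To make the graphic-display idea concrete, here is a minimal sketch in Python (matplotlib) of the kind of chart Marzano describes: a single goal, rubric levels instead of points, and one student's formative assessment scores across a cluster cycle. The chunk names, scores, and goal level are invented for illustration and are not from the TAP materials.

```python
# A hypothetical student progress display: one goal, rubric levels (1-4),
# scores plotted across the chunks of a cluster cycle.
import matplotlib.pyplot as plt

chunks = ["Pre-Test", "Chunk 1", "Chunk 2", "Chunk 3", "Post-Test"]
rubric_scores = [1, 2, 2, 3, 4]   # invented rubric-level scores
igp_goal = 3                      # hypothetical Student IGP Goal level

plt.plot(chunks, rubric_scores, marker="o", label="Student rubric score")
plt.axhline(igp_goal, linestyle="--", color="gray", label="Student IGP Goal")
plt.ylim(0, 4.5)
plt.ylabel("Rubric level")
plt.title("Progress on the targeted skill across the cluster cycle")
plt.legend()
plt.show()
```

A display like this gives the student both pieces of information from slide 35: where each score sits on the rubric, and how the scores progress toward the goal.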

  37. Tracking Data • Formative Assessments ARE the student data in a cluster cycle • Before data is tracked, the criteria, scoring guide, and pre-test are presented to cluster • Data must be analyzed both quantitatively and qualitatively in order to thoroughly track students’ progress

  38. Cluster Skill-Cycle Tracking Chart Handout Growth Formula: Percentage Growth = (Post-Test Score - Pre-Test Score) / Pre-Test Score *Exception: with a Pre-Test score of 0, the % growth IS the Post-Test score. A decreasing score uses the same formula; the % will be negative.* Qualitative Data

  39. Cycle-Skill Student Tracking Chart Handout Growth Formula: Percentage Growth = (Post-Test Score - Pre-Test Score) / Pre-Test Score *Exception: with a Pre-Test score of 0, the % growth IS the Post-Test score. A decreasing score uses the same formula; the % will be negative.*
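
The Growth Formula from the two tracking charts above can be written as a short function; here is a minimal Python sketch (editor's illustration, not part of the training materials) that also encodes the charts' stated exception for a pre-test score of 0. Multiplying by 100 to express the ratio as a percent is an assumption; the charts leave that step implicit.

```python
def percentage_growth(pre: float, post: float) -> float:
    """Percentage Growth = (Post-Test Score - Pre-Test Score) / Pre-Test Score."""
    if pre == 0:
        # Charts' exception: with a pre-test score of 0, the % growth
        # IS the post-test score.
        return post
    return (post - pre) / pre * 100   # negative when the score decreases

print(percentage_growth(40, 60))   # (60 - 40) / 40 = 0.5  -> 50.0 (% growth)
print(percentage_growth(0, 35))    # pre-test of 0 -> growth reported as 35
print(percentage_growth(60, 45))   # (45 - 60) / 60 = -0.25 -> -25.0 (decrease)
```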

  40. Grade/Content _________ Skill ____________ Strategy _________________ Date & Color of Modifications: ___________________________________ Handout Qualitative Data

  41. Tracking Data Video • Present, chart, and analyze data (Quantitative) • Define and categorize characteristics of student work samples using the HML chart (Qualitative) Handout Sandra Walker-Parker, Trenise Duvernay

  42. Tracking Reminders • Present, chart, and analyze student work samples (Formative Assessments) for each chunk. • Revisit the HML chart (add & scratch through). • Continuously connect back to the IGP student goal to monitor progression toward it. • Use the data to pinpoint specific problems with sub-skills and individual students and determine next steps.
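
As a companion to these reminders, here is a small Python sketch of revisiting an HML (High/Medium/Low) chart and using sub-skill scores to pinpoint where individual students struggle. The student names, sub-skills (Restatement and Details, echoing the scoring guide sections above), and score bands are all hypothetical.

```python
# Hypothetical formative-assessment scores by sub-skill (rubric levels 1-4).
students = {
    "Student A": {"restatement": 4, "details": 2},
    "Student B": {"restatement": 1, "details": 1},
    "Student C": {"restatement": 3, "details": 3},
}

def hml_band(total: int) -> str:
    """Assign an H/M/L band from the combined sub-skill score (assumed cut points)."""
    if total >= 6:
        return "High"
    if total >= 4:
        return "Medium"
    return "Low"

for name, subskills in students.items():
    band = hml_band(sum(subskills.values()))
    weak = [skill for skill, score in subskills.items() if score <= 2]
    print(f"{name}: {band}; focus sub-skills: {', '.join(weak) or 'none'}")
```

Sorting this way mirrors the HML chart: the bands inform whole-cluster decisions, while the flagged sub-skills point to next steps for individual students.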

  43. Checking for Understanding • Formative assessments are ALWAYS aligned with and measured against the criteria and scoring guide. • Formative assessments need to be a mixture of test items formatted and aligned to the high-stakes test and test items where students must demonstrate their knowledge in other ways. • Quantitative and qualitative data are extracted from formative assessments.

  44. Checking for Understanding • The data from the formative assessments guides the progress of the cluster cycle. • The data should be frequently measured against the Student IGP Goal in order to determine when to post-test. • The data is also used to guide all decisions, determine the success of a strategy, and pinpoint specific student needs.

  45. Rally Coach Handout

  46. Evaluate Your Scoring Guide • Does the scoring guide relate to the outcome being measured? • Is the scoring guide free from irrelevant skills/sub-skills and information? • Are the descriptors and scoring levels well defined? • Do the criteria align with high-stakes testing? • Can teachers and students understand and consistently apply the scoring guide? • Is the rubric developmentally appropriate? • Does it reflect teachable skills or the strategy? • Will it provide the kind of information you need and can use effectively? Handout

  47. Book Talks • Developing Minds by Costa (Nicole) • The Power of Retelling by Benson (Vicky)

  48. Evaluation & Closure
