
Northborough-Southborough DDM Development


Presentation Transcript


  1. Northborough-Southborough DDM Development October 9, 2014 Dr. Deborah Brady

  2. Northborough-Southborough • DDM 1 - MCAS (SGP) • For teachers who receive an SGP from MCAS (grades 4-8, ELA and Math only) • The District is only required to use median Student Growth Percentiles (SGP) from one MCAS area per teacher. • In the first year, the K-5 DDM will focus only on MCAS ELA. • In grades 6-12, the MCAS focus may be either math or ELA. • The DDM rating is based on the SGP (student growth) and not the scaled scores (student achievement). • DDM 1 - Common Assessment • For teachers who do not receive an SGP from MCAS: • Teachers will develop grade-level/course common assessments utilizing a pre- and post-assessment model. • DDM 2 - Common Assessment • For all teachers: • Teachers will develop grade-level/course common assessments utilizing a pre- and post-assessment model.

  3. Goal: 2014-2015 (DDMs must be negotiated with our Associations) • Content Student Learning DDMs • *Core Content Areas • (Core areas: math, English, science, and social studies) • Year 1: Identify the first two (of four) unique DDM data elements • Alignment of DDMs with Massachusetts Curriculum Frameworks • Identify/develop DDMs by common grades (K-12) and content • Create rubric • Collect first year of data • Year 2: Identify the second two (of four) unique DDMs, or utilize the 2014-2015 DDMs (same assessment, different students) • Note: Consumer science, applied arts, health & physical education, business education, world language, and SISPs received a one-year waiver • Planning: Identify/develop DDMs for 2015-2016 implementation • Collect first year of data in 2015-2016

  4. Core DDMs

  5. Quality Assessments • Substantive • Aligned with the standards of the Frameworks, Vocational standards, and/or local standards • Rigorous • Consistent in substance, alignment, and rigor • Consistent with the District's values, initiatives, and expectations • Measures growth (to be contrasted with achievement) and shifts the focus of teaching

  6. Scoring Student Work • Districts will need to determine fair, efficient, and accurate methods for scoring students' work. • DDMs can be scored by the educators themselves, groups of teachers within the district, external raters, or commercial vendors. • For districts concerned about the quality of scoring when educators score their own students' work, processes such as randomly re-scoring a selection of student work to ensure proper calibration, or using teams of educators to score together, can improve the quality of the results. • When an educator plays a large role in scoring his/her own students' work, a supervisor may also choose to factor the scoring process into the determination of a Student Impact Rating.
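
A minimal sketch of the random re-scoring idea mentioned above; the 10% sample rate, the fixed seed, and the paper IDs are illustrative assumptions, not a DESE requirement.

```python
import random

def select_for_rescoring(paper_ids, sample_rate=0.10, seed=42):
    """Randomly pick a portion of already-scored student work for an
    independent second scoring pass (a simple calibration audit)."""
    rng = random.Random(seed)  # fixed seed keeps the audit sample reproducible
    sample_size = max(1, round(len(paper_ids) * sample_rate))
    return rng.sample(paper_ids, sample_size)

# Example: pull roughly 10% of 50 papers for a second rater to re-score
papers = [f"paper_{i:02d}" for i in range(1, 51)]
print(select_for_rescoring(papers))
```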

  7. Some Possible Common Exam Examples • A Valued Process: PORTFOLIO: a 9-12 ELA portfolio measured by a locally developed rubric that assesses progress throughout the four years of high school • K-12 Writing or Writing to Text: a district requirement that at least one DDM be "writing to text," based on CCSS-appropriate text complexity • Focus on Data that is Important: a HS science department assessment of lab report growth for each course (focus on conclusions) • "New CCSS" Concern: a HS science department assessment of data, or of diagram or video analysis

  8. More • CCSS Math Practices: a HS math department's use of PARCC examples that require writing, asking students to "justify your answer" • SS Focus on DBQs and/or PARCC-like Writing to Text: a social studies department created a PARCC-style exam built on primary sources; another social studies department used "mini-DBQs" in freshman and sophomore courses • Music: writing about a concert • Common Criteria Rubrics for Grade Spans: Art (color, design, mastery of medium), Speech (developmental levels)

  9. More • Measure the True Goal of the Course: autistic, behavioral, or alternative programs and classrooms measure social-emotional development of independence (the whole collaborative participates; each educator is measuring) • SPED "Directed Study" Model: Study Skills are now explicitly recorded by the week for each student, and by quarter on a manila folder: note-taking skills, text comprehension, reading, writing, preparing for an exam, and time management, differentiated by student • A Vocational School's use of Jobs USA assessments for one DDM and the local safety protocols for each shop

  10. Assessing Math Practices: Communicating Mathematical Ideas • Clearly constructs and communicates a complete response based on: • a response to a given equation or system of equations • a chain of reasoning to justify or refute algebraic, function, or number system propositions or conjectures • a response based on data • How can you assess these standards?

  11. Demonstrating Growth • Billy Bob's work is shown below; he has made a mistake. In the space to the right, solve the problem on your own. Then find Billy Bob's mistake, circle it, and explain how to fix it. • Finding the mistake provides students with a model, requires understanding, and requires writing in math. • Billy Bob's work: ½x − 10 = −2.5; +10 = +10; ½x + 0 = 12.5; (2/1)(½)x = (12.5)(2); x = 25 • Your work (blank space on the slide) • Explain the changes that should be made in Billy Bob's work
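
For reference, a worked solution, assuming the problem is ½x − 10 = −2.5 as reconstructed above; on that reading, Billy Bob's slip is computing −2.5 + 10 as 12.5 instead of 7.5.

```latex
\begin{aligned}
\tfrac{1}{2}x - 10 &= -2.5 \\
\tfrac{1}{2}x &= -2.5 + 10 = 7.5 \quad \text{(Billy Bob wrote 12.5, a sign slip)} \\
x &= 2 \cdot 7.5 = 15 \quad \text{(his } x = 25 \text{ follows from the slip)}
\end{aligned}
```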

  12. A small step? A giant step? The district decides • "Which of the three conjectures are true? Justify your answer." • "Determine if each of Michelle's three conjectures is true. Justify each answer." • A resource for DDMs

  13. Rubrics and grading: are numbers good, or a problem?

  14. Objectivity versus Subjectivity: Calibration • Human judgment and assessment • What is objective about a multiple-choice test? • Calibrating standards in using rubrics • Common understanding of descriptors • What do "insightful," "in-depth," and "general" look like? • Use exemplars to keep people calibrated • Assess collaboratively with a uniform protocol • (Slide graphic: rubric descriptor levels ranging from "insightful and deep understanding" through "general" and "details" to "many misconceptions")

  15. Consistency in Directions for Administering Assessments • Directions to teachers need to define rules for giving support, dictionary use, etc. • What can be done? What cannot? • "Are you sure you are finished?" • How much time? • Accommodations and modifications?

  16. Qualitative Methods of Determining an Assessment's VALIDITY • Looking at the "body of the work" • Validating an assessment based upon the students' work • Checking for floor and ceiling effects • If you piled the gain scores (not achievement) into High, Moderate, and Low gain, is there a mix of at-risk, average, and high achievers throughout each pile, or can you see one group mainly represented? (A sketch of this check follows below.)
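
A minimal sketch of that "pile" check with invented records; a district would substitute its own gain scores and achiever-group labels.

```python
from collections import Counter

# Hypothetical records: (student, achiever_group, gain_score)
students = [
    ("Ana", "at-risk", 12), ("Ben", "average", 9), ("Cam", "high", 3),
    ("Dee", "at-risk", 8),  ("Eli", "high", 11),  ("Fay", "average", 4),
]

# Sort by gain (not achievement) and split into thirds: low, moderate, high
ranked = sorted(students, key=lambda s: s[2])
third = len(ranked) // 3
piles = {
    "low": ranked[:third],
    "moderate": ranked[third:2 * third],
    "high": ranked[2 * third:],
}

# A healthy DDM mixes achiever groups across piles; one group dominating a
# single pile suggests a floor or ceiling effect.
for label, pile in piles.items():
    print(label, Counter(group for _, group, _ in pile))
```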

  17. Low, Moderate, High Growth Validation • Did your assessment accurately pinpoint differences in growth? • Look at the LOW pile. If you think about their work during this unit, were they struggling? • Look at the MODERATE pile. Are these the average learners who learned about what you'd expect of your school's students in your class? • Look at the HIGH pile. Did you see them learning more than most of the others in your class did? • Based on your answers to 1, 2, and 3: • Do you need to add questions (for the very high or the very low)? • Do you need to modify any questions (because everyone missed them or because everyone got them correct)?

  18. A psychometric process called "Body of the Work" validation: look at specific students' work • Tracey is a student who was rated as having high growth. • James had moderate growth. • Linda had low growth. • Investigate each student's work: • Effort • Teachers' perception of growth • Other evidence of growth • Do the scores assure you that the assessment is assessing what it says it is?

  19. Objectivity versus Subjectivity: Multiple-Choice Questions • Human judgment and assessment • What is objective about a multiple-choice test? • What is subjective about a multiple-choice test? • Make sure the question's complexity did not cause a student to make a mistake. • Make sure the choices are all about the same length, in similar phrasing, and clearly different.
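
One such check a department could automate is flagging items whose answer choices differ sharply in length, since an oddly long or short option can cue test-wise students; the 50% tolerance here is an illustrative assumption, not a rule.

```python
def flag_uneven_options(question, options, tolerance=0.5):
    """Flag a multiple-choice item whose options vary widely in length."""
    lengths = [len(o) for o in options]
    avg = sum(lengths) / len(lengths)
    uneven = [o for o in options if abs(len(o) - avg) > tolerance * avg]
    if uneven:
        print(f"Review {question!r}: uneven option lengths -> {uneven}")

flag_uneven_options(
    "What is the median?",
    ["The middle value of a sorted list", "The mean", "The mode", "The range"],
)
```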

  20. Rubrics and Inter-Rater Reliability: Getting words to mean the same to all raters

  21. Protocol for Developing Inter-Rater Reliability • Before scoring a whole set of papers, develop inter-rater reliability • Bring High, Average, and Low samples (1 or 2 of each) (HML Protocol) • Use your rubric or scoring guide to assess these samples • Discuss differences until a clear definition is established • Use these first papers as your exemplars • When there's a question, select one person as the second reader
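
After a calibration round like this, a simple exact-agreement rate gives a quick read on whether the rubric's words mean the same thing to all raters. A sketch with invented scores; more formal statistics (e.g., Cohen's kappa) exist, but the idea is the same.

```python
def percent_agreement(rater_a, rater_b):
    """Share of papers on which two raters gave the same rubric score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Hypothetical rubric scores (1-4) from two raters on the same six papers
rater_a = [3, 2, 4, 1, 3, 2]
rater_b = [3, 2, 3, 1, 3, 2]
print(f"Exact agreement: {percent_agreement(rater_a, rater_b):.0%}")  # 83%
```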

  22. Annotated Exemplar • Prompt: How does the author create the mood in the poem? • Student response: The speaker's mood is greatly influenced by the weather. The author uses dismal words such as "ghostly," "dark," "gloom," and "tortured." • Annotations (callouts on the slide): answer and explanation in the student's words; specific substantiation from the text

  23. “Growth Rubrics” May Need to Be Developed

  24. Protocols to Use with Implemented Assessments • Floor and Ceiling Effects • Validating the Quality of Multiple-Choice Questions • Inter-Rater Reliability with Rubrics and Scoring Guides • Low-Medium-High Looking at Student Work Protocol (calibration, developing exemplars, developing an action plan)

  25. FAQ from DESE • Do the same numbers of students have to be identified as having high, moderate, and low growth? There is no set percentage of students who need to be included in each category. Districts should set parameters for high, moderate, and low growth using a variety of approaches. • How do I know what low growth looks like? Districts should be guided by the professional judgment of educators. The guiding definition of low growth is that it is less than a year's worth of growth relative to academic peers, while high growth is more than a year's worth of growth. If the course meets for less than a year, districts should make inferences about a year's worth of growth based on the growth expected during the time of the course. • Can I change scoring decisions when we use a DDM in the second year? It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time. • Will parameters of growth be comparable from one district to another? Different assessments serve different purposes. While statewide SGPs will provide a consistent metric across the Commonwealth and allow for district-to-district comparisons, DDMs are selected locally, so their growth parameters will not generally be comparable across districts.

  26. Calculating Scores: What you need to understand as you are creating assessments

  27. (Figure: sample student MCAS score histories with their SGPs. Scaled score 288 to 244, SGP 25; 230 to 230, SGP 35; 214 to 225, SGP 92. The figure also shows a student ID, 4503699.)

  28. (Figure: a second set of samples. Scaled score 248 to 244, SGP 25; 230 to 230, SGP 34; 214 to 225, SGP 92.)

  29. Median Student Growth Percentile • Median SGP for the 6th grade class • Imagine that the list of students to the left includes all the students in your 6th grade class. Note that they are sorted from lowest to highest SGP. The point where 50% of students have a higher SGP and 50% have a lower SGP is the median.
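
A minimal illustration of the computation with invented SGPs:

```python
from statistics import median

# Hypothetical SGPs for a 6th grade class, one per student, sorted low to high
sgps = [12, 25, 34, 41, 48, 55, 62, 71, 80, 88, 92]

# The median is the point where half the class has a higher SGP and half a
# lower one; with an odd count it is simply the middle value.
print(median(sgps))  # 55
```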

  30. Sample Cut Score Determination (for local assessments)
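
A sketch of how locally negotiated cut scores might be applied; the 35/65 cuts below are purely illustrative placeholders, since DESE deliberately leaves these parameters to districts (see the FAQ above).

```python
def impact_category(median_growth, low_cut=35, high_cut=65):
    """Map a class's median growth score to a local growth category.
    The cut scores are placeholders a district would negotiate."""
    if median_growth < low_cut:
        return "Low growth"
    if median_growth > high_cut:
        return "High growth"
    return "Moderate growth"

for m in (28, 50, 72):
    print(m, "->", impact_category(m))  # Low, Moderate, High
```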

  31. Important Perspective It is expected that districts are building their knowledge and experience with DDMs. DDMs will undergo both small and large modifications from year to year. Changing or modifying scoring procedures is part of the continuous improvement of DDMs over time. We are all learners in this initiative.

  32. Next Steps Today • Begin to Develop Common Assessments • Consider Rigor and Validity (Handout Rubrics) • Develop Rubric (Consider scoring concerns) • Develop Common Expectations for Directions (to Teachers) Other Important Considerations: • Consider when assessments will be given • The amount of time they will take • The impact on the school

  33. Handout Rubrics • Bibliography: sample exams; sample texts • Rubrics • Types of questions (multiple choice, essay, performance) • Reliability • Will you design two exams, pre- and post-? • Ultimate validity • Does it assess what it says it does? • How does it relate to other data? • Step-by-step, precise considerations (DESE) • Quality Rubric (all areas) • Protocol for determining growth scores
