
Leominster Public Schools District Determined Measures





Presentation Transcript


  1. Leominster Public Schools District Determined Measures Dr. Deborah A. Brady, Ribas Associates, Inc.

  2. Agenda First Hour: • Overview of District Determined Measures • The Timeline • Quality Assessments • Tools from DESE • Resources • Rubrics • Core Content Objectives Second Hour: Job-alike groups and departments work together; Beth Pratt and Deb Brady will go from group to group. Product: Facilitators hand in any unanswered questions.

  3. Today’s objectives • By the end of the workshop, participants will: • Understand the quality expectations and assessment criteria for DDM assessments • Begin to draft a schedule for this year for your team or department • Begin the process of developing DDMs (time permitting) by • Using the Quality Tracking Tool on at least one possible DDM • Using the Educator Alignment Tool to consider local assessment needs • Email or send a hard copy of your group’s meeting minutes • Include progress • Remaining questions • What you will need to be successful

  4. Resource Materials online for today and after: http://writingtotextbrady.wikispaces.com/Leominster+Assessment+Workshop • DESE Tools • Quality Tracking Tool (Excel file) • Educator Assessment Tool (Excel file) • Core Curriculum Objectives (CCOs) • Example Assessments (mainly commercial; some local) • Model Curriculum Units with • Rubrics (Curriculum Embedded Performance Assessments) • Rubrics: Cognitive Rigor Matrices: Reading, Writing, Math, Science • Research • NY and NYC • Achieve.org, PARCC, and many others

  5. The Big Picture

  6. Pilot Year SY2014 SEPTEMBER: DESE received B-R’s plan for • Early grade literacy (K-3) • Early grade math (K-3) • Middle grade math (5-8) • High school “writing to text” (PARCC multiple texts) • PLUS one more non-tested course, for example: • Fine Arts • Music • PE/Health • Technology • Media/Library • Or other non-MCAS growth courses, including grade 10 Math and ELA, and Science DECEMBER: Implementation Extension Request Form due for specific courses in the June plan BY JUNE: Plan for all other DDMs must be ready for implementation in year 2 (SY2015) At least one “local” (non-MCAS) measure and two measures per educator. The scores will not count for those who pilot DDMs in 2014.

  7. SY2015 YEAR 2 • All professional personnel will be assessed with 2 DDMs, at least one local: • Guidance • Principals, Assistant Principals • Speech Therapists • School Psychologists • Nurses • All teachers not yet assessed, general and special education The scores will count as the first half of the “impact score,” with the waivered courses as the only exception.

  8. Year 3 SY2016 “Impact Ratings” will be given to all licensed educational personnel and sent to DESE • Two measures for each educator • At least one local measure for everyone • Some educators will have two local measures • Locally determined measures can include Galileo, DRA, MCAS-Alt • The MCAS Growth Scores can be one measure • The average of two years of scores • And a two-year trend “Impact Ratings” are based upon two years’ growth scores on two different assessments, one local.

  9. DESE is still rolling out the evaluation process and District Determined Measures. [Slide graphic: 4-3-2-1 rating scale]

  10. DDM Impact 2014 From the Commissioner: “Finally, let common sense prevail when considering the scope of your pilots. I recommend that, to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught. There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”

  11. Everyone earns two ratings Summative Performance Rating: Exemplary / Proficient / Needs Improvement / Unsatisfactory Impact Rating on Student Performance: High / Moderate / Low *Most districts will not begin issuing Impact Ratings before the 2014-2015 school year. Massachusetts Department of Elementary and Secondary Education

  12. Student Impact Rating Determines Plan Duration

  13. [Slide graphic: sample student score reports showing MCAS scaled scores with Student Growth Percentiles, e.g. 244 / 25 SGP, 230 / 35 SGP, 225 / 92 SGP]
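The scores on this slide pair an MCAS scaled score with a Student Growth Percentile (SGP): a student scoring 225 can still show an SGP of 92 if that growth outpaced academic peers. As a hedged illustration of the idea only, the sketch below ranks one student's growth against a peer group; real SGPs are computed by DESE with quantile regression on statewide data, and the numbers here are invented.

```python
# Simplified, hypothetical sketch of the idea behind a Student Growth
# Percentile: rank one student's growth against academic peers who had
# similar score histories. Not DESE's actual method (quantile regression).

def growth_percentile(student_growth, peer_growth):
    """Percent of academic peers whose growth this student exceeded."""
    below = sum(1 for g in peer_growth if g < student_growth)
    return round(100 * below / len(peer_growth))

# Invented growth values for ten academic peers
peers = [2, 5, 7, 9, 11, 12, 15, 18, 21, 25]
print(growth_percentile(10, peers))  # this student outgrew 4 of 10 peers
```

The point the slide makes survives the simplification: the percentile reflects growth relative to peers, not the absolute scaled score.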

  14. A Variety of Assessment Types Quality reviews of many purchasable and a few local assessments are available online at http://www.doe.mass.edu/edeval/ddm/example/ Types • On Demand (timed and standardized) • Mid-Year and End-of-Year exams • Projects • Portfolios • Capstone Courses • Unit tests Formats • Multiple choice • Constructed response • Performance (oral, written, acted out)

  15. Acceptable Assessments • MCAS Growth Scores can serve as one score (ELA and Math, grades 4-8; not grade 3, not high school) • MCAS Growth Scores must be used when available, but all educators will have 2 different measures • The MA Model Units rubrics can be used (online for you) • Galileo • BERS-2 (Behavioral Rating Scales) • DRA (Reading) • Fountas and Pinnell Benchmark • DIBELS (Fluency) • MCAS-Alt • MAP

  16. “The task predicts performance” (Elmore) Why determining these measures is important to every educator, beyond their impact on evaluation

  17. Development Considerations (Today) Assessment Quality • Validity • Reliability • Rigor • Scoring Guides • Inter-rater reliability • You will receive tools for these areas today

  18. Implementation Considerations (Later workshop) • Calibration of scorers • Developing assessment protocols • Are all assessments of equally appropriate rigor K-12? Integrity of scores • “Assessment creep” • Training assessors • Time • Tabulating growth scores from student scores • Organizing and storing scores

  19. Can this be an opportunity? Capitalize on what you are already doing • Writing to text 9-12? K-12? • Research K-12? Including Specialists? • Art, Music, PE, Health present practices • Math—one focus K-12? • “Buy, borrow, or build your own” DESE

  20. Tools to Facilitate the Work Tools to assess Alignment Tools to assess Rigor Tools to assess the quality of student work

  21. Tracking Tool: Two Essential Quality Considerations Alignment and Rigor • Alignment to Common Core, PARCC, and the District Curriculum • Shifts for Common Core have been made: • Complex texts • Multiple texts • Argument, Info, Narrative • Math Practices • Depth over breadth

  22. Reliability and Validity Considerations Reliability • Internal consistency • Test-retest • Alternate forms/split half • Inter-rater reliability • Rated 0 to 1 (none to 100%) Validity • Are you measuring what you intend to assess? • Content (= curriculum) • Consequential validity (good or bad impact): Does this assessment narrow the curriculum? • Relationships (to SAT, to grades) • Measured by correlation, -1 to +1
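The two numeric scales on this slide can be made concrete. As a minimal sketch with invented scores: inter-rater reliability in its simplest form is percent agreement between two scorers (0 to 1), and a relationship-based validity check is a correlation coefficient (-1 to +1) between DDM scores and another measure such as course grades or SAT scores.

```python
# Hypothetical illustration of the two statistics named on the slide.
# All scores below are invented for the example.

def percent_agreement(rater_a, rater_b):
    """Simplest inter-rater reliability: fraction of exact matches (0 to 1)."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

def pearson_r(x, y):
    """Correlation coefficient (-1 to +1), e.g. DDM scores vs. grades."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Two raters scoring the same six essays on a 4-point rubric
rater_a = [4, 3, 3, 2, 4, 1]
rater_b = [4, 3, 2, 2, 4, 1]
print(percent_agreement(rater_a, rater_b))  # 5 of 6 scores agree

# DDM scores vs. course grades for the same five students
ddm = [55, 62, 70, 78, 85]
grade = [60, 65, 72, 75, 90]
print(round(pearson_r(ddm, grade), 3))
```

In practice districts would use more robust statistics (e.g. Cohen's kappa for agreement), but the interpretation on the slide (closer to 1 means more consistent, closer to +1 means more strongly related) is the same.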

  23. Reliable: Results in the same area (consistency) Valid: hits the target of your goals (measures what you teach)

  24. Teacher Alignment Tool Purpose: 2 DDMs per Educator

  25. Selecting DDMs “Borrow, Buy, or Build” • PRIORITY: Use the Quality Tool to assess each potential DDM to pilot this year for your school (one district final copy on a computer) • CCOs will help if this is a district-developed tool • If there is additional time, use the Educator Assessment Tool to begin developing 2 assessments for all educators for next year

  26. The first entry point for selecting DDMs • Is the measure aligned to content? • Does it assess what is most important for students to learn and be able to do? • Does it assess what the educators intend to teach? (VALIDITY)

  27. Core Curriculum Objectives(CCOs—partial list for Writing to Text)

  28. ELA-Literacy, English 9-12 — https://wested.app.box.com/s/pt3e203fcjfg9z8r02si Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies (publisher website/sample). Designed to be a measure of student growth over time in high school ELA and social science courses. The student selects work samples to include and uploads them to an electronic site. Includes guiding questions for students and scoring criteria. The scoring rubric for the portfolio can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio; could be aligned to a number of CCOs, depending on specification of assignments.

  29. Sample DDMs from DESE • Buy, Borrow, Build • Each sample DDM is evaluated; the Hudson portfolio assessment described on the previous slide is one example • Many are standardized assessments

  30. Second in importance: • Is the measure informative? • Do the results of the measure inform educators about curriculum, instruction, and practice? • Does it provide valuable information to educators about their students? • Does it provide valuable information to schools and districts about their educators?

  31. Approaches to Measuring Student Growth • Pre-Test/Post-Test • Repeated Measures (running records) • Holistic Evaluation (portfolio) • Post-Test Only (only when the assessment lacks a pre-test baseline; AP exams, for example, use norms as the baseline)
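The first approach on this slide, pre-test/post-test, is the most common starting point for a locally built DDM: each student's growth is the change from a fall baseline to a spring score on the same (or an equivalent) assessment. A minimal sketch, with invented student names and scores:

```python
# Hypothetical sketch of the pre-test/post-test growth approach.
# Student names and scores are invented for the example.

def growth_scores(pre, post):
    """Return each student's raw growth: post-test minus pre-test."""
    return {name: post[name] - pre[name] for name in pre}

pre = {"Avery": 40, "Blake": 55, "Casey": 62}
post = {"Avery": 58, "Blake": 63, "Casey": 80}

print(growth_scores(pre, post))  # {'Avery': 18, 'Blake': 8, 'Casey': 18}
```

The other approaches differ only in where the baseline comes from: repeated measures use many data points across the year, portfolios are judged holistically against a rubric, and post-test-only designs substitute an external norm for the missing pre-test.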

  32. More Tools to Ground the Process For Assessing Rigor and Alignment • Daggett’s Rigor/Relevance Scale • DESE’s Model Curriculum (Understanding by Design) • Curriculum Embedded Performance Assessments from MA Model Curriculum • PARCC’s Task Description • PARCC’s Rubrics for writing

  33. PARCC: Rationale for Writing to Text

  34. Other Tools: MA Model Curricula and Rubrics

  35. Cognitive Rigor Matrix for Reading, Writing, Science, Math

  36. Research: What Other States Are Doing • New York State and New York City examples • Portfolio (DESE-approved, from Hudson PS) • Connecticut: Specific tasks (excellent for the Arts, Music) • PARCC question and task prototypes: http://www.parcconline.org/samples/item-task-prototypes

  37. Job-Alike Groups (1 hour) Purpose • Discuss possible assessments • Consider what you need to accomplish this year using the Schedule and Checklist • Use the Quality Tracking Tool on one assessment to understand how it supports your district, school, or department • Look at the Educator Alignment Tool to consider the “singletons” that may need to be addressed in your district, school, or department Product: Email or send a hard copy to Beth Pratt with minutes of your group’s meeting, including • Assessments that you are working on • Next steps • What you need to be successful

  38. NEXT STEPS: Five Considerations • Measure growth • Employ a common administration procedure • Use a common scoring process • Translate these assessments to an Impact Rating • Assure comparability of assessments (rigor, validity)
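The fourth step, translating assessment results into an Impact Rating, can be sketched concretely. As a hedged example only: districts commonly summarize a roster's growth with the median and band it into low/moderate/high; the cut points below (35 and 65) are an assumption for illustration, since DESE leaves the exact translation to districts.

```python
# Hypothetical sketch of translating a roster's growth percentiles into
# the low / moderate / high impact bands. Cut points are assumed, not
# prescribed by DESE.
import statistics

def impact_band(growth_percentiles, low_cut=35, high_cut=65):
    """Classify a roster's median growth percentile into an impact band."""
    median = statistics.median(growth_percentiles)
    if median < low_cut:
        return "Low"
    if median > high_cut:
        return "High"
    return "Moderate"

print(impact_band([25, 35, 92, 60, 55]))  # median of 55 -> "Moderate"
```

The final Impact Rating then combines two such measures over two years, as described on the Year 3 slide.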
