
Marshfield Public Schools District Determined Measures

Dr. Deborah A. Brady

Ribas Associates, Inc.



Do Now*

  • Please create a name tag or a “name tent” with your first name and school or department.

  • Read the Table of Contents on page 1.

  • Respond to the DO Now on page 2 of your handout.

    *The materials are online at http://tinyurl.com/l7287z9 if you want to follow along and add notes.



The SCOPE of the Work

PLANNING



New Timetable (changes in handout on page 3)



DESE is still rolling out the evaluation process and District Determined Measures




NEW DESE Support for Teacher Evaluation and Alignment to the Common Core

  • Sample DDMs in the five required pilot areas

  • Technical Assistance and Networking sessions on September 19th

  • Technical Guide B (in this PowerPoint) addresses the practical application of assessment concepts to piloting potential DDMs and measuring student growth.

  • Model collective bargaining language

  • An ongoing Assessment Literacy webinar series

  • Guidance on constructing local growth scores and growth models will be released.

  • Guidance on determining Student Impact Rating will be released.



Support from DESE

  • Additional Model Curriculum Units, which include curriculum-embedded performance assessments (CEPAs)

  • Guidance on the use of CEPAs as part of a DDM-strategy.

  • Professional development for evaluators on how to focus on shifts embedded in the new ELA and math Curriculum Frameworks during classroom observations.

  • Professional development for evaluators on how to administer and score DDMs and use them to determine high, moderate or low growth, focused on the five required DDM pilot areas.

  • A Curriculum Summit in November



DDM Impact 2014

  • Take advantage of a no-stakes pilot year to try out new measures and introduce educators to this new dimension of the evaluation framework.

  • Districts are strongly encouraged to expand their pilots beyond the five required pilot areas.

  • Fold assessment literacy into the district's professional development plan to stimulate dialogue amongst educators about the comparative benefits of different potential DDMs the district could pilot.

  • Consider how contributing to the development or piloting of potential DDMs can be folded into educators' professional practice goals.



DDM Impact 2014 (continued)

From the Commissioner:

“Finally, let common sense prevail when considering the scope of your pilots.

“I recommend that to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught.

“There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”


Pilot Year SY2014

SEPTEMBER: provide DESE a tentative plan for:

  • Early grade literacy (K-3)

  • Early grade math (K-3)

  • Middle grade math (5-8)

  • High school “writing to text” (PARCC multiple texts)

  • PLUS one more non-tested course, for example:

    • Fine Arts

    • Music

    • PE/Health

    • Technology

    • Media/Library

    • Other non-MCAS growth courses including grade 10 Math and ELA, Science

      DECEMBER: Implementation Extension Request Form for specific courses in the JUNE PLAN

      BY JUNE: Plans for all other DDMs must be ready for implementation in Year 2 (SY2015), with at least one "local" (non-MCAS) measure and two measures per educator.

The scores will not count for those who pilot DDMs in 2014.



SY 2015

  • All professional personnel will be assessed with two DDMs, at least one of which will be locally determined:

    • All teachers

    • Guidance

    • Principals, Assistant Principals

    • Speech Therapists

    • School Psychologists

    • Nurses

      EXCEPT those granted waivers by DESE through a case-by-case decision process.

The scores will count as the first half of the "impact score," with the waivered courses as the only exception.



SY2016

  • “Impact Ratings” will be given to all licensed educational personnel and sent to DESE

    • Two measures for each educator

    • At least one locally determined measure for everyone

    • Some educators will have two locally determined measures

    • The locally determined measure can be a standardized test such as the DRA, MAP, Galileo, etc.

    • The MCAS can be only one measure

    • The average of two years' scores

"Impact Ratings" are based upon two years' growth scores for two different assessments, at least one of which is a locally determined, non-MCAS measure.



Every educator earns two ratings: a Summative Performance Rating (Exemplary, Proficient, Needs Improvement, or Unsatisfactory) and an Impact Rating on Student Performance (High, Moderate, or Low).

*Most districts will not begin issuing Impact Ratings before the 2014-2015 school year.




Student Impact Rating Determines Plan Duration




Acceptable (Standardized, but still considered District Determined) Assessments

  • MCAS can serve as one score (for ELA, Math, or Science)

  • One or two locally developed assessments; some educators may have three

  • DESE Exemplars for the required piloted areas will be available in August 2013

  • The MA Model Units Rubrics can be used

  • Galileo

  • BERS-2 (Behavioral and Emotional Rating Scale)

  • DRA (Reading)

  • Fountas and Pinnell Benchmark

  • DIBELS (Fluency) ???

  • MCAS-Alt

  • MAP

  • AP



A Variety of Assessment Types

  • On Demand (timed and standardized)

  • Mid-Year and End-of-Year exams

  • Projects

  • Portfolios

  • Capstone Courses

  • Unit tests

  • Other

Formats can include:

  • Multiple choice

  • Constructed response

  • Performance (oral, written, acted out)



What kinds of assessments will work for administrators, guidance, nurses, school psychologists?

  • Use School-wide Growth Measures

  • Use MCAS and extend it to all educators in a school

  • Use “indirect measures” such as dropout rates, attendance, etc., as measures

  • Use Student Learning Objectives (SLOs)?

  • Team-based SLOs?

  • Or create measures.

  • A pre-test and post-test are required to measure growth



GROWTH SCORES for Educators Will Need to Be Tabulated for All Locally Developed Assessments

(Slide example: MCAS scores paired with Student Growth Percentiles, e.g., 244 / 25 SGP, 230 / 35 SGP, 225 / 92 SGP.)
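As a rough illustration only (not DESE guidance), a district that reports a growth percentile for each student on a locally developed assessment could tabulate an educator-level growth score by taking the median, the same summary typically used for MCAS SGP. The roster values below are hypothetical.

```python
from statistics import median

# Hypothetical student growth percentiles (1-99) for one educator's roster,
# echoing the SGP values shown on the slide (25, 35, 92) plus a few more.
student_sgps = [25, 35, 92, 48, 60, 71, 55]

# MCAS-style reporting summarizes a group with its median SGP; a locally
# developed assessment could be tabulated the same way.
educator_median_sgp = median(student_sgps)
print(f"Median growth percentile for this educator: {educator_median_sgp}")
```

The median is usually preferred over the mean here because percentiles are ordinal rather than interval.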


According to Technical Guide B (summarized on page 31 of the handout), Focus on the Following:

  • Is the measure aligned to content?

  • Is the measure informative?



Entry point to DDM work: Two Focus Questions

  • Is the measure aligned to content?

    • Does it assess what is most important for students to learn and be able to do?

    • Does it assess what the educators intend to teach?



Entry point to DDM work: Two Focus Questions

  • Is the measure informative?

    • Do the results of the measure inform educators about curriculum, instruction, and practice?

    • Does it provide valuable information to educators about their students?

    • Does it provide valuable information to schools and districts about their educators?



Five Considerations

  • Measure growth

  • Common administration procedure 

  • Common scoring process

  • Translate to an Impact Rating (see the sketch after this list)

  • Comparability
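One of those considerations, translating a growth result into an Impact Rating, ultimately comes down to agreed-upon cut points. Below is a minimal sketch of that translation; the 35/65 thresholds are illustrative assumptions for a district to set, not DESE policy.

```python
def impact_rating(median_growth, low_cut=35, high_cut=65):
    """Map an educator's median growth percentile to Low/Moderate/High.

    The 35/65 cut points are hypothetical placeholders, not DESE-prescribed.
    """
    if median_growth < low_cut:
        return "Low"
    if median_growth <= high_cut:
        return "Moderate"
    return "High"

print(impact_rating(55))  # -> Moderate
```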



What is comparability?

  • Comparable within a grade, subject, or course across schools within a district

    • Identical measures are recommended

  • Comparable across grade or subject level district-wide

    • Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor



Measuring Student Growth with DDMs



Approaches to Measuring Student Growth

  • Pre-Test/Post Test

  • Repeated Measures

  • Holistic Evaluation

  • Post-Test Only



Pre/Post Test

  • Description:

    • The same or similar assessments administered at the beginning and at the end of the course or year

    • Example: Grade 10 ELA writing assessment aligned to College and Career Readiness Standards at beginning and end of year

  • Measuring Growth:

    • Difference between pre- and post-test scores (see the sketch below).

  • Considerations:

    • Do all students have an equal chance of demonstrating growth?



Repeated Measures

  • Description:

    • Multiple assessments given throughout the year.

    • Example: running records, attendance, mile run

  • Measuring Growth:

    • Graphically

    • Ranging from the sophisticated to the simple (a simple trend line is sketched below)

  • Considerations:

    • Less pressure on each administration.

    • Authentic Tasks
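On the simple end of that range, growth across repeated administrations can be summarized with a trend line. A sketch follows, using hypothetical running-record error counts.

```python
# Hypothetical running-record error counts over five administrations.
administrations = [1, 2, 3, 4, 5]
errors = [14, 12, 9, 8, 5]

# Ordinary least-squares slope: a negative value means errors are falling
# from one administration to the next, i.e., the student is growing.
n = len(administrations)
mean_x = sum(administrations) / n
mean_y = sum(errors) / n
numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(administrations, errors))
denominator = sum((x - mean_x) ** 2 for x in administrations)
slope = numerator / denominator
print(f"Average change per administration: {slope:+.1f} errors")
```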


Repeated Measures Example: Running Record

(Chart: number of errors plotted against date of administration.)



Holistic

  • Description:

    • Assess growth across student work collected throughout the year.

    • Example: Tennessee Arts Growth Measure System

  • Measuring Growth:

    • Growth Rubric (see example)

  • Considerations:

    • Option for multifaceted performance assessments

    • Rating can be challenging & time consuming



Holistic Example

Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho.  Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts



Post-Test Only

  • Description:

    • A single assessment or data that is paired with other information

    • Example: AP exam

  • Measuring Growth, where possible:

    • Use a baseline (sketched below)

    • Assume equal beginning

  • Considerations:

    • May be only option for some indirect measures

    • What is the quality of the baseline information?
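Where only a post-test exists, "growth" can only be approximated against a baseline. The sketch below assumes a prior cohort's average serves as that baseline, which is a local design choice rather than anything DESE prescribes.

```python
# Hypothetical AP exam scores (1-5 scale) for this year's class.
this_year = [3, 4, 5, 2, 4, 3, 5]

# Baseline standing in for the "assumed equal beginning": the prior
# cohort's average score (a hypothetical value).
baseline_average = 3.1

class_average = sum(this_year) / len(this_year)
print(f"Class average {class_average:.2f} vs. baseline {baseline_average:.2f} "
      f"(difference {class_average - baseline_average:+.2f})")
```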



Examples

  • Portfolios

    • Measuring achievement v. growth

  • Unit Assessments

    • Looking at growth across a series

  • Capstone Projects

    • May be a very strong measure of achievement



Piloting District Determined Measures



Piloting DDMs

  • Piloting:

    • Test

    • Analyze

    • Adjust

    • Repeat

  • Being strategic and deliberate:

    • Collaboration

    • Iteration

    • Information



Pilot Steps:

  • Prepare to pilot

    • Build your team

    • Identify content to assess

    • Identify the measure

      • Aligned to content

      • Informative

    • Decide how to administer & score

  • Test

    • Administer

    • Score

  • Analyze

  • Adjust



    Analyzing Results: Example Focus Questions

    • Is the measure fair to special education students?

    • Are there differences in scores due to the rater?

    • Is growth equal across the scale?



    Analyzing and adjusting:

    Each DDM should have:

    • Directions for administering

    • Student directions

    • Instrument

    • Scoring method

    • Scoring directions



    Resources

    • Existing

      • ESE Staff

      • Part VII of the Model System

      • Technical Guide A

      • Assessment Quality Checklist and Tracking Tool

      • Assessment Literacy Webinar Series

      • Materials from Technical Assistance sessions

      • Commissioner's Memorandum

      • Technical Guide B

    • What’s Coming

      • Exemplar DDMs (August 30th)

      • Other Supporting Materials



    Considerations

    See pages 5-6 of the handout for DESE recommendations

    Table or Partner Talk



    Time to Consider and Begin to Plan

    Pages 5-6 in Handout



    Can this be an opportunity?

    • Some options:

      • Writing to text 9-12? K-12?

      • Research K-12? Including Specialists

      • Art, Music, PE, Health

      • Math—one focus K-12?

      • Are there existing assessments that might be modified slightly?



    ONE PLAN

    Consider all of the options, concerns, initiatives, possibilities as you look at what the next step for your school and district should look like.

    Be ready to share this very basic “first think” on DDMs.

    After this, you will be given tools to support your assessment of the quality, rigor, and alignment of your tasks and curricula.


    "The task predicts performance" (Elmore)

    http://edworkspartners.org/expect-success/2012/09/21st-century-aligned-assessments-identify-develop-and-practice-2/

    Page 2 (The DO Now)

    Process with a partner. Why might Elmore’s idea be germane to your planning? What can educators learn from DDMs?



    Tools to Facilitate the Work

    Tools to assess Alignment

    Tools to assess Rigor

    Tools to assess the quality of student work



    2 DESE Tools to Facilitate the Tasks

    Quality Tracking Tool

    • Assess the quality of your inventory of assessments

    • Also use the Lexicon of Quality Tracking Tool Terms (in packet)

    • On the DESE website: http://www.doe.mass.edu/edeval/ddm/

    Educator Alignment Tool

    • An interactive database of all educators and the possible assessments that could be used for each

    • Temporarily removed from the DESE website



    Tracking Tool Sample (page 11)

    • Checklist

    • Tracker



    Two Essential Quality Considerations

    Alignment

    Rigor

    • Alignment to Common Core, PARCC, and the District Curriculum

    • Shifts for Common Core have been made:

      • Complex texts

      • Multiple texts

      • Argument, Info, Narrative

      • Math Practices

      • Depth over breadth


    Daggett's Rigor/"Complexity" Scale

    "Task Complexity Continuum" (1 to 5): MCAS ELA ORQ at the low end, then MCAS Composition and Math ORQ, PARCC multiple-text tasks in the middle, and, at the high end, Common Core-aligned classrooms with authentic tasks, both simple and complex.



    Teacher Alignment Tool



    More Tools to Guide the Process

    For Assessing Rigor and Alignment

    • Daggett’s Rigor/Relevance Scale

    • DESE’s Model Curriculum (Understanding by Design)

    • DESE’s Model Curriculum Rubrics (a destination)

    • PARCC’s Task Description

    • PARCC’s Rubrics for writing

    • Protocols for Calibration (to use with teacher groups)

    • Writing to Text Wikispace: http://tinyurl.com/l7287z9



    Writing to Text and PARCC

    The Next Step?

    • The 2011 MA Frameworks Shifts to the Common Core

      • Complex Texts

      • Complex Tasks

      • Multiple Texts

      • Increased Writing

        A Giant Step? An increase in cognitive load.

      • Mass Model Units—PBL with Performance-Based Assessments (CEPAs)

      • PARCC assessments require matching multiple texts



    Understanding the Literary Analysis Task

    • Students carefully consider two literary texts worthy of close study.

    • They are asked to answer a few EBSR (Evidence-Based Selected Response) and TECR (Technology-Enhanced Constructed Response) questions about each text to demonstrate their ability to do close analytic reading and to compare and synthesize ideas.

    • Students write a literary analysis about the two texts.



    Grade 10 Prose Constructed-Response Item

    Use what you have learned from reading “Daedalus and Icarus” by Ovid and “To a Friend Whose Work Has Come to Triumph” by Anne Sexton to write an essay that provides an analysis of how Sexton transforms Daedalus and Icarus.

    As a starting point, you may want to consider what is emphasized, absent, or different in the two texts, but feel free to develop your own focus for analysis.

    Develop your essay by providing textual evidence from both texts. Be sure to follow the conventions of standard English.



    RESEARCH (pp. 15-16)

    • State assessments

    • Massachusetts Model Assessments (PBL/Performance Assessments)

    • Quality Performance Assessments (Capstones; units)

    • New PARCC question and task prototypes http://www.parcconline.org/samples/item-task-prototypes

    • Writing to Text on Wikispaces (My collected resources; many address the “giant step” of pairing texts, an increased cognitive load, and PARCC’s standard as well as curriculum models.) http://tinyurl.com/l7287z9


    Other Tools: MA Model Curricula and Rubrics (pp. 6-10), CEPAs


    Understanding by Design: MA Model Curricula Template



    Protocols to Use Locally for Inter-Rater Reliability; Looking at Student Work

    • Developing rubrics

    • Developing exemplars

    • Calibrating scores (a simple agreement check is sketched below)

    • Looking at Student Work (LASW)

    • http://Nsfharmony.org/protocol/a_z.html

    • Sample for Developing Rubrics from an assessment

    Rather than first focusing on the work's quality, these processes often ask teachers to suspend judgment and describe its qualities, bringing multiple perspectives to bear on what makes students tick and how a school can better reach them.
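Calibration work usually boils down to checking how often raters agree on the same student work. A minimal sketch of percent exact and adjacent agreement follows; the rubric scores are hypothetical, and this is an illustration rather than any of the protocols linked above.

```python
# Hypothetical scores from two raters on the same ten papers (1-4 rubric).
rater_a = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4, 2, 3]

# Percent exact agreement: the simplest calibration check a scoring team can run.
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Exact agreement: {exact:.0%}")

# Adjacent agreement (within one rubric level) is often reported as well.
adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b)) / len(rater_a)
print(f"Adjacent agreement: {adjacent:.0%}")
```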



    High-Medium-Low (H-M-L) Protocol

    • This protocol reduces the initial anxiety of competition since everyone will have an example of each level.

    • New groups can use this as a beginning protocol to explore and develop shared expectations for student learning and performance.

    • Each teacher brings in two or three examples of high, medium, and low level work for a specific task, test, or prompt

    • A simple protocol to monitor improvement

    • Teachers identify patterns in student work

    • Teachers create an action plan based on the patterns

    • Or teachers develop rubrics as descriptors of levels of quality of student work


    DESE Descriptors (pp. 27-29)


    "Don't let perfection get in the way of good."

    "We are all in this together."

    "It will not be perfect."

    "We will be making mistakes along the way."

    "We need your help to make the process better."



    Trust

    • Focus on Growth, not Gotcha

    • Form Joint Committees

    • The conversation between supervisor and educator is the most critical

    • Communication is Key

      • Joint Meetings/Union Leaders, Members and Administration

      • Engage School Committee

        • Training opportunities

        • Enlist support for rewarding exceptional teaching and for supervision of mediocre/ineffective teaching

        John Doherty

