Changes to Quality Review in 2010-11



  1. Changes to Quality Review in 2010-11
  • To address the areas of concern highlighted by data and critiques of the QR process throughout 2009-10, we have made changes to the:
  • Quality Review rubric
  • Quality Review scoring guidelines
  • Quality Review selection criteria
  • Quality Review site visit protocols
  • Quality Review report

  2. 1. Quality Review Rubric
  • Highlights of changes to the rubric are below. A color-coded version of the rubric on the QR page of the DOE website clearly depicts each change from 2009-10 to 2010-11.
  • Articulated the Underdeveloped column and moved language down in indicators to more accurately capture the lowest level of practice observed
  • UPF is now labeled “Developing”
  • Inserted language regarding “Across classrooms…” in various areas of the rubric
  • Indicator 2.2 now focuses more explicitly on assessment quality and coherence with curriculum
  • Integrated language referring to the Common Core State Standards (4.3, 5.1, 5.2, 5.3)

  3. 2. Quality Review Scoring Guidelines
  • The scoring guidelines are changing to a point-based system with cut scores between quality categories.
  • A school will earn points on each of the 20 indicators, and these points will add up directly to the overall score.
  • This shift addresses a pressing fairness concern: in the past, the scoring policy allowed two schools to earn the same array of indicator scores yet receive different overall scores, depending on how the indicator scores were distributed.
  • Example: a school with four Proficient indicators and 16 Well Developed indicators was scored Proficient overall if pairs of Proficient indicators fell in two separate Quality Statements; another school with the same number of Proficient and Well Developed indicators was rated Well Developed overall when each of the Proficient indicators fell in four separate Quality Statements.

  4. 2. Quality Review Scoring Guidelines (cont.)
  • The point-based scoring guidelines also offer the opportunity to weight key indicators more heavily than others. The following indicators will be doubled in scoring weight:
  • 1.1: Rigorous and accessible curriculum
  • 1.2: Differentiated classroom practices and pedagogy
  • 1.3: Leveraging structures, technology, and resources to improve student outcomes
  • 2.2: Assessment quality and alignment to curriculum
  • 4.1: Data-informed staff support and performance evaluation decisions

  5. 2. Quality Review Scoring Guidelines (cont.)
  • Using the following point scale:
  • Well Developed: 4 points
  • Proficient: 3 points
  • Developing: 2 points
  • Underdeveloped: 1 point
  • with a total of 20 indicators, five of which are weighted with double value:
  • the highest score possible on a Quality Review is 100, and
  • the lowest score possible on a Quality Review is 25.
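The arithmetic above can be sketched in a few lines of Python. This is only an illustration of the point-based tally, not the DOE's actual Excel "QR Scoring Calculator"; the indicator IDs (four indicators per Quality Statement, giving the stated total of 20) and the function name are assumptions for the example.

```python
# Sketch of the 2010-11 QR point-based scoring (illustrative, not the
# official DOE "QR Scoring Calculator").

# Point values for each quality category (from the slide above).
POINTS = {
    "Well Developed": 4,
    "Proficient": 3,
    "Developing": 2,
    "Underdeveloped": 1,
}

# The five double-weighted indicators listed on the previous slide.
DOUBLE_WEIGHTED = {"1.1", "1.2", "1.3", "2.2", "4.1"}

def qr_score(ratings):
    """Sum points over all 20 indicators; ratings maps an indicator ID
    (e.g. "2.2") to its quality category."""
    return sum(
        POINTS[category] * (2 if indicator in DOUBLE_WEIGHTED else 1)
        for indicator, category in ratings.items()
    )

# With 15 single- and 5 double-weighted indicators, the effective weight
# total is 15 + 5*2 = 25, so scores range from 25*1 = 25 to 25*4 = 100.
```

A school rated Well Developed on every indicator would score 100, and one rated Underdeveloped on every indicator would score 25, matching the maximum and minimum stated above.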

  6. 2. Quality Review Scoring Guidelines (cont.)
  • The chart below shows the cut scores and scoring ranges.
  • The cut line between Well Developed and Proficient remains essentially the same as in 2009-10. The cut lines for Proficient and Developing return to levels similar to those required for Proficient and UPF in 2008-09.
  • An Excel file, the “QR Scoring Calculator,” has been created to aid score tallying; it is available for download on the Quality Review page of the DOE website.

  7. 3. Quality Review Selection Criteria
  • Given the results of State tests in the lower grades and the alterations to our system’s Progress Reports, we would be slated to review over 1,150 schools if we used the QR selection criteria from 2009-10.
  • Therefore, we are changing the criteria to ensure that every school experiences a review within a four-year cycle. The following criteria will trigger a Quality Review during 2010-11:
  • 2009-10 Progress Report of F, D, or a third C in a row (2007-08, 2008-09, and 2009-10)
  • 2009-10 Quality Review of UPF or U
  • Schools in their second year (opened in September 2009)*
  • Schools identified as Persistently Lowest Achieving by New York State
  • Schools with principals at risk of not receiving tenure
  • Schools chosen by lottery, within districts, from those that have not had a review since 2007-08; schools that do not receive a review this year will receive one next year.*
  * See the slide on Peer Reviews

  8. 3. New School Quality Reviews (NSQR)
  • Schools opening in 2010-11 will have a one-day New School Quality Review (NSQR).
  • As in 2009-10, these reviews will be conducted by the network team, and the reports will be shared internally but not published or used for accountability purposes.
  • For more information, see the NSQR documents on the QR webpage of the NYCDOE site.

  9. Peer Review Proposal
  • In the last year, DPA documented a number of networks and schools that piloted different models of peer visits and reviews, all with significant positive feedback (see the QR Promising Practices Library: https://www.arisnyc.org/connect/node/813911).
  • Every school is encouraged to engage in these formative intervisitations. The option of a more formalized Peer Review process is being considered for:
  • Schools in their second year (opened in 2009-10)
  • Schools in the selection lottery showing a sustained history of significant gains, i.e., a grade of “A” on the Progress Report in 2007-08, 2008-09, and 2009-10.
  • Under this proposal, the Peer Review would occur in lieu of an external Quality Review. Reports would be shared internally but not published or used for accountability purposes. DSSI and DPA will communicate further details of this proposal to you shortly.

  10. 4. Quality Review Site Visit Protocols
  • Almost all of the site visit protocols will remain the same.
  • At least one of the two teacher team meetings must exhibit an examination of student work alongside teacher work (curriculum, academic tasks, assessments/rubrics, etc.). More details will be provided in the coming weeks regarding this expectation.
  • Both teacher team meetings will provide an opportunity for the reviewer to triangulate information on, among other things, how the school is approaching the evolving nature of the New York State standards (i.e., the implications of the Common Core State Standards).

  11. Follow Up
  • Opportunities for principals and network teams to learn more about the Quality Review in 2010-11 are being arranged by Cluster leadership. These sessions will occur in mid- to late September.
  • For more information, go to the Quality Review webpage of the NYCDOE website, where supporting documents are housed.
  • Visit the Quality Review Promising Practices Library (QR-PPL) in ARIS Connect for resources related to the Quality Review Rubric: research articles, narratives of NYC school practices, and videos aligned to the Well Developed language of the rubric. A link to the QR-PPL appears on the lower right side of the ARIS home page.
  • If you have questions, write to the Quality Review team at qualityreview@schools.nyc.gov, or discuss with your network/cluster SATIF.
