
Denver Public Schools



Presentation Transcript


  1. Denver Public Schools Unified Improvement Planning 102 August 15, 2012

  2. Announcement: Training about Inside & Edge In Spring 2012, DPS adopted NEW materials for middle and high school ELLs: Inside materials for middle school; Edge materials for high school. Training: Saturday, August 25, 9 a.m. to 3:30 p.m. at Del Pueblo. Please ask teachers to register via SchoolNet.

  3. Why Our Work Matters The following slides provide global, national and local context for the work in which we’re engaged.

  4. Percent of Adults with an Associate Degree or Higher by Age Group - U.S. & Leading OECD Countries Source: OECD, Education at a Glance 2008

  5. Change in Population Age 25-44 By Race/Ethnicity, 2005-2025 Source: U.S. Census Bureau

  6. Difference in Education Attainment Between Whites and Hispanics (2006, Percent) Source: U.S. Census Bureau, 2006 American Community Survey (ACS) Public Use Microdata Sample (PUMS) File. Via NCHEMS

  7. Gaps by Ethnicity: TCAP Reading Status • The Reading achievement gap between White students and American Indian, Asian, and Two or More Races students decreased slightly from 2011 to 2012. All other gaps remained relatively unchanged. • Every subgroup within DPS experienced growth on TCAP Reading from 2011 to 2012.

  8. Gaps by Ethnicity: TCAP Math Status • The Math achievement gap between White students and Black, Hispanic, and Pacific Islander students increased from 2011 to 2012. All other gaps remained relatively unchanged. • 5 of the 7 subgroups experienced growth on TCAP Math from 2011 to 2012.

  9. Gaps by Ethnicity: TCAP Writing Status • With the exception of Black students, the Writing achievement gap between White students and all other ethnic groups decreased slightly from 2011 to 2012. • 6 of the 7 subgroups experienced growth on TCAP Writing from 2011 to 2012.

  10. Gaps by Ethnicity: TCAP Science Status • The Science achievement gap increased between American Indian and White students, but decreased between White students and Asian and Two or More Races students. • 5 of the 6 subgroups experienced growth on TCAP Science between 2011 and 2012. * The Pacific Islander population is <16 students and is therefore not included.

  11. Warming Up UIP Scavenger Hunt

  12. State Feedback on DPS UIPs The next four slides summarize the feedback CDE provided on the DPS priority improvement and turnaround school UIPs submitted in January 2012. All feedback listed was common among most of the DPS UIPs. Feedback in bold font indicates that almost all priority improvement and turnaround schools received that type of feedback; feedback not in bold indicates that about half, or slightly fewer, of these schools received it.

  13. State Feedback on DPS UIPs Data Analysis • Trend/data analysis did not drill down far enough (i.e., disaggregation by grade level, subgroups, cohorts, etc.) • Data are listed, but there is no description of what the data tell us • Exclusion of strategic focus on subgroups when data show subgroups underperforming • Deeper root cause analysis needed; ask more "why" questions • Too many priority performance challenges (CDE recommends 3 to 4) • Trends concentrated on one content area or performance indicator • Too many root causes

  14. State Feedback on DPS UIPs Data Narrative • Does not describe the processes used to prioritize the challenges or to identify and verify root causes • Data analysis lacks coherence: • Performance challenges do not relate directly to the data • Contradictions between the data narrative and the data in the trend section • No connection between the data and the root cause

  15. State Feedback on DPS UIPs Target Setting • Exclusion of strategic focus on subgroups • Some schools did not mention how often the school will examine interim measures. • Some schools did not list interim measures: "Smart Goal Success" is not an interim measure.

  16. State Feedback on DPS UIPs Action Planning • More detail needed in action steps (e.g., what does "individualized PD and support" mean?) • Implementation benchmarks should include how schools will measure completion and effectiveness, when, and by whom: • Current implementation benchmarks: • "Prior to each unit being taught." • "Weekly via homework assignments and via parent conferences." • "Teacher sign-in sheets." • Example according to CDE critical quality criteria: • "100% of teachers will attend training in Oct 2012. Attendance will be measured by a sign-in sheet at the training. Principal walkthroughs with a rubric will provide evidence of use of new training skills, Nov-Dec 2012." • Plan only addresses 1 year • Total budget does not provide an exact dollar amount • Major improvement strategies and action steps listed are insufficient to increase student performance.

  17. New UIP State Requirements/Updates • Pre-Populated Template & Data Analysis • Adequate Yearly Progress (AYP) is no longer a part of school or district accountability; metrics removed from UIP • CELApro growth including median growth percentiles and median adequate growth percentiles added to SPF/DPF and UIP Template. • Disaggregated graduation rates added to SPF/DPF and UIP Template.

  18. New UIP State Requirements/Updates • English Language Proficiency: • CELA median student growth percentiles and median adequate growth percentiles appear in the state SPF • If this is a priority performance challenge, targets must be set in this area and reflected in the UIP template. • Disaggregated Graduation Rates: • Graduation rates for disaggregated groups appear in the state SPF • If this is a priority performance challenge for the school/district, targets must be set in this area and reflected in the UIP template. • Disaggregated Student Achievement: The state has established guidelines for setting targets for disaggregated student group performance. Starting in 2011-12, schools and districts should consider this guidance in establishing targets for student academic achievement. • These changes affect every section of the UIP.

  19. Consider Prior Year’s Performance • Review prior year’s performance (Worksheet #1) • List the targets set for last year • Denote the following: • Whether the target was met • How close the school/district came to meeting the target; and • The degree to which current performance supports continuing with the current major improvement strategies and action steps (NEW addition)

  20. New UIP State Requirements/Updates [Diagram: UIP data and information flow] The data narrative serves as a repository, beginning with a description of the school and the data analysis process ("School A has ….") and feeding into a review of current performance, trend analysis (Worksheets 1 & 2), and action plans (Target Setting/Action Planning forms).

  21. New UIP State Requirements/Updates • Data Narrative • More guidance provided • It serves as a repository for everything you do in the UIP process. • Elements to include in the data narrative: • Description of the School Setting and Process for Data Analysis • Review of Current Performance • State & Federal Accountability Expectations • Progress Towards Last Year’s Targets • Trend Analysis • Priority Performance Challenges • Root Cause Analysis • Throughout the school year, capture the following in the data narrative: • Progress Monitoring (Ongoing)

  22. New UIP State Requirements/Updates • UIP is a 2-Year Plan • The plan and following elements should cover 2 academic years: • Targets • Major Improvement Strategies • Associated Action Steps

  23. Digging Into Your UIP

  24. Locate the “Exemplar” Activity • You have three different-colored sheets at your table. • Review each of them and determine which one meets the critical quality criteria. • Discuss your selection with your colleagues. • Take a few seconds to compare the “exemplar” with your own UIP (status reading data/trend statements).

  25. Step 2: Identify Trends • Include all performance indicator areas. • Identify indicators* where the school did not at least meet state and federal expectations. • Consider data beyond that included in the school performance framework (e.g., local data). • Include positive and negative performance patterns. * Indicators listed on the pre-populated UIP template include: status, growth, growth gaps, and postsecondary/workforce readiness

  26. Writing Trend Statements: • Identify the measure/metrics. • Describe for which students (grade level and disaggregated group). • Include at least three years of data (ideally 5 years). • Describe the trend (e.g., increasing, decreasing, flat). • Identify for which performance indicator the trend applies. • Determine if the trend is notable and describe why.

  27. Examples of Notable Trends • The percent of 4th grade students who scored proficient or advanced on math TCAP/CSAP declined from 70% to 55% to 48% between 2009 and 2011, dropping well below the minimum state expectation of 71%. • The median growth percentile of English language learners in writing increased from 28 to 35 to 45 between 2009 and 2011, meeting the minimum expectation of 45 and exceeding the district trend over the same time period. • The dropout rate remained relatively stable (15, 14, 16) and much higher than the state average between 2009 and 2011.
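The trend statements above all follow the same arithmetic: compare yearly values and label the direction. A minimal sketch of that labeling step, assuming a "flat" band of 2 points (the band is this example's assumption, not a CDE rule):

```python
# Label a series of yearly values (e.g., percent proficient) as
# increasing, decreasing, or flat. flat_band is an assumed tolerance.
def classify_trend(values, flat_band=2.0):
    change = values[-1] - values[0]
    if change > flat_band:
        return "increasing"
    if change < -flat_band:
        return "decreasing"
    return "flat"

# The 4th grade math example above: 70% -> 55% -> 48%
print(classify_trend([70, 55, 48]))   # decreasing
# The dropout-rate example: 15, 14, 16 (relatively stable)
print(classify_trend([15, 14, 16]))   # flat
```

The direction label is only the starting point; the slide's guidance still applies — a notable trend statement also names the measure, the students, the years, and why the pattern matters.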

  28. Disaggregating Data [Slide contrasts "Yes" and "No" examples]

  29. List of Subgroups

  30. Analyzing Trends: Keep in Mind… • Be patient and hang out in uncertainty • Don’t try to explain the data • Observe what the data actually show • No “because” statements

  31. A Path Through the Data… 1. Review the SPF/DPF report to identify where performance did not at least meet expectations (federal/state/local). 2. Select one content area on which to focus. 3. Consider performance (achievement/growth) by grade level for 3+ years. 4. Consider performance by disaggregated group by grade level for 3+ years. 5. Disaggregate groups further. 6. Within grade levels, consider achievement by standard/sub-content area. 7. Look across groups. 8. Consider cross-content-area performance (3+ years). 9. Consider PWR metrics over 3+ years.

  32. Priority Performance Challenges

  33. Priority Performance Challenges Examples • The percent of fifth grade students scoring proficient or better in mathematics has declined from 45% three years ago, to 38% two years ago, to 33% in the most recent school year. • For the past three years, English language learners (making up 60% of the student population) have had median growth percentiles below 30 in all content areas. • Math achievement across all grade-levels and all disaggregated groups over three years is persistently less than 30% proficient or advanced.

  34. Priority Performance Challenges Non-Examples • Review student work and align proficiency levels to the Reading Continuum and Colorado Content Standards. • Provide staff training in explicit instruction and adequate programming designed for intervention needs. • Implement interventions for English Language Learners in mathematics. • Provide budgetary support for paraprofessionals to support students with special needs in regular classrooms. • No differentiation in mathematics instruction when student learning needs are varied.

  35. What is a Root Cause? • Root causes are statements that describe the deepest underlying cause, or causes, of performance challenges. • They are the causes that, if dissolved, would result in the elimination, or substantial reduction, of the performance challenge(s). • Root causes describe WHY the performance challenges exist. • They are things we can change and need to change. • They are the focus of our major improvement strategies. • They are about adult action.

  36. Steps in Root Cause Analysis (an iterative process) • Focus on a performance challenge (or closely related performance challenges). • Consider External Review results (or categories). • Generate explanations (brainstorm). • Categorize/classify explanations. • Narrow (eliminate explanations over which you have no control) and prioritize. • Deepen thinking to get to a “root” cause. • Validate with other data.

  37. Root Cause Activity Priority Performance Challenge: The percent of English Language Learners (74% of students) scoring proficient or above in mathematics has declined from 45% three years ago, to 38% two years ago, to 33% in the most recent school year. • Work with your table partners to determine the potential root causes for the performance challenge. • Follow steps on previous slide. (7 minutes)

  38. Levels of Root Causes

  39. From Priority Performance Challenge to Root Cause…

  40. Important to Verify Root Cause Ask the key questions for identifying whether a cause is a root cause: • Would the problem have occurred if the cause had not been present? • Will the problem reoccur if the cause is corrected or dissolved? Make any final revisions to your root cause explanation as needed. Preuss, P. (2003). Root Cause Analysis: School Leader’s Guide to Using Data to Dissolve Problems. Larchmont, NY: Eye on Education.

  41. Verify Root Causes (Examples) Priority Performance Challenge: The percent of proficient/advanced students in reading has been substantially above state expectations in 3rd grade, but substantially below expectations and stable (54%, 56%, 52%) in 4th and 5th grades for the past three years.

  42. Break & Reflection Time • Break Time: 10 minutes

  43. How to Set Annual Performance Targets… 1. Focus on a priority performance challenge. 2. Review state or local expectations. 3. Determine the timeframe (max 5 years). 4. Determine the progress needed in the first two years. 5. Describe annual targets for two years. (Action Planning Tools, p. 9)
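One simple way to carry out steps 3-5 above is linear interpolation: spread the gap between current performance and the expectation evenly across the timeframe, then report the first two annual targets. A minimal sketch under that assumption (even yearly growth is this example's simplification, not a CDE formula; schools may need steeper early targets):

```python
# Given current performance, an expectation, and a timeframe in years,
# divide the gap evenly and return the first two annual targets.
def annual_targets(current, expectation, years):
    step = (expectation - current) / years
    return [round(current + step * (i + 1), 1) for i in range(2)]

# e.g., 48% proficient now, state minimum expectation of 71%, 5-year timeframe
print(annual_targets(48, 71, 5))
```

With those inputs the yearly step is 4.6 points, so the two-year targets are 52.6% and 57.2% — the "progress needed in the first two years" that the UIP template asks for.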

  44. Minimum Expectations for Target Setting (2011 Data)

  45. DPS District Accountability • DPS’ goal is to close or significantly reduce the academic achievement and postsecondary readiness gaps between DPS and the state by 2015. DPS’ SPF is used to evaluate school performance as schools pursue the district’s goal. • School specific targets have been established that represent each school’s share of the district’s goal annually through 2015. Schools should strive to achieve these targets at minimum; however, some schools will need to set higher targets.

  46. Academic Achievement Targets

  47. Activity: Did You Meet Your DPS Targets? • Compare your current TCAP data to the 2012 targets that were set for your school in all 4 content areas. • In which content area(s) did you meet the target? • In which content area(s) did you not meet the target? • Which content areas need to be a focus for the ’12-’13 school year? • How will you use your current performance data to set targets for the ’12-’13 school year?

  48. Guiding Questions for Target Setting • Is the target aligned to the Priority Performance Challenge? • How will subgroups be addressed within the target? • What will be the amount of progress you expect to make in the next 2 years? • What implications will the current target have for meeting the 5 year goal?

  49. Interim Measures • Once annual performance targets are set for the next two years, schools must identify interim measures, or what they will measure during the year, to determine if progress is being made towards each of the annual performance targets. • Interim measures should be based on local performance data that will be available at least twice during the school year.

  50. Interim Measures • Examples of Interim Measures: • District-level Assessments: Benchmarks/Interims, STAR, SRI • School-level Assessments: End of Unit Assessments, DIBELS • Measures, metrics and availability should be specified in the School Goals Form. • Remember that the Interim Measures need to align with Priority Performance Challenges. Disaggregated groups should be included as appropriate.
