
March 8, 2011






Presentation Transcript


  1. Student Outcomes and Principal Evaluation: Key Questions for PEAC Principal Evaluation Subcommittee March 8, 2011

  2. Overview of the Webinar
  • Review of guiding questions for sub-committee consideration
  • Introduction and review of value-added measures and update on value-added models being created in CPS
  • Discussion of guiding questions
  New Leaders for New Schools

  3. Overview of the Webinar
  • Review of guiding questions for sub-committee consideration
  • Introduction and review of value-added measures and update on value-added models being created in CPS
  • Discussion of guiding questions

  4. Guiding questions on student outcomes
  • What measures should be used in evaluating principals?
  • What is the right balance between value-added growth measures and attainment measures?
  • How, if at all, should we adjust our judgments based on a school’s demographics and other characteristics, like student mobility?
  • How many years of data should be used for any single year’s rating on student growth?
  • What processes and parameters should guide local flexibility and adaptation to the state system over time?
  • For each of these categories, we identify specific questions (noted in bold) and considerations based on research and our experience (noted in italics).

  5. Measures of Student Outcomes: K-8
  • Should we use ISAT data?
    • Better matched to principal evaluation than teacher evaluation:
      • Larger pool of students in growth analyses allows for less variability in the direction of results
      • Clearer attribution of students to the principal (with clear mobility parameters)
    • Serves as one important element of the student outcomes piece, but helpful if balanced with non-test outcomes (in high school) or other assessment data (as long as it is consistent across the LEA)
    • Important to use multiple years of information to establish a trend
    • Can be used to measure attainment (e.g., % of kids meeting proficiency), gain/growth (e.g., increase in % of kids meeting proficiency), and value-added
  • Should we use interim assessments?
    • Technically sound, but with some cautions:
      • More reliable than summative tests if computer adaptive
      • Assessments may not cover all content
      • Students may not take interim assessments seriously
      • Such assessments are not meant for use as accountability tools
    • From 2014-15, the PARCC assessments should provide an integrated solution to interim and summative assessments
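
The slide above distinguishes attainment (level of performance), gain/growth (change in level), and value-added. The first two can be sketched in a few lines; the scores and the proficiency cutoff below are hypothetical illustrations, not actual ISAT scales or cut scores:

```python
# Hypothetical illustration of attainment vs. gain/growth.
# The cutoff and scores are made up; they do not reflect ISAT scaling.

PROFICIENCY_CUTOFF = 200  # assumed cutoff, for illustration only

def attainment(scores):
    """Attainment: percent of students at or above the cutoff."""
    return 100.0 * sum(s >= PROFICIENCY_CUTOFF for s in scores) / len(scores)

def gain(scores_last_year, scores_this_year):
    """Gain/growth: change in the percent proficient between years."""
    return attainment(scores_this_year) - attainment(scores_last_year)

last_year = [180, 190, 205, 210]   # 50% proficient
this_year = [195, 205, 210, 220]   # 75% proficient

print(attainment(this_year))       # 75.0
print(gain(last_year, this_year))  # 25.0
```

Note that gain compares cohort to cohort at the school level; value-added, by contrast, follows the same students over time, which is why it is treated separately later in the deck.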

  6. Measures of Student Outcomes: K-8 (continued)
  • Should we use school-level student outcome goals set by principals and their managers?
    • Common practice, but depends on the rigor of principal manager expectations
  • What other measures of student growth, beyond state tests, should we consider?
    • Measures of student aspirations toward college in middle school grades
    • Student attendance

  7. Measures of Student Outcomes: High School Considerations
  • Should we use PSAE data?
    • Can be used to assess subjects beyond reading and math (i.e., writing, science)
    • Can be used as an attainment measure (% of students reaching proficiency) and as a growth measure (increase in % of students reaching proficiency)
    • Substantial technical issues in converting these data to value-added estimates:
      • Gap between 8th grade ISAT and 11th grade PSAE, with data distortion from dropouts and retained students
    • Anticipate improved ability to make value-added estimates using PARCC assessments in 2014-15 and onward
  • What other measures of student growth, beyond state tests, should we consider?
    • High school student growth measures should expand beyond state tests to include “on track” to college measures:
      • Student attendance
      • Grade-to-grade progression
      • Credit accumulation (potentially including “quality of credits”)
      • Cohort graduation rates, and quality of diploma earned (if data exist)
    • Note: These measures can be turned into “value-added” metrics by looking at predicted values versus actual values at the school level
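
The note above, that "on track" measures can become value-added metrics by comparing predicted and actual values, can be sketched as a school-level residual from a simple regression. The outcome chosen here (attendance rates) and all numbers are hypothetical, and real value-added models use many more predictors than this one-variable fit:

```python
# Sketch of a school-level "predicted vs. actual" value-added metric for a
# non-test outcome (attendance, as an example). All numbers are hypothetical.

def fit_line(x, y):
    """Ordinary least squares fit y = a + b*x across schools."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

def value_added(prior, actual):
    """Residual for each school: actual outcome minus predicted outcome."""
    a, b = fit_line(prior, actual)
    return [yi - (a + b * xi) for xi, yi in zip(prior, actual)]

prior_attendance  = [90.0, 92.0, 94.0]   # last year's rates, by school
actual_attendance = [91.0, 92.0, 96.0]   # this year's rates, by school
print(value_added(prior_attendance, actual_attendance))  # [0.5, -1.0, 0.5]
```

A positive residual means the school did better than schools with similar prior results would predict; residuals average to zero across schools by construction.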

  8. Balancing attainment, growth, and value-add
  • How should we weight attainment, growth, and value-add within an overall rating?
    • Focusing more on movement measures (i.e., gain/growth, value-add):
      • Provides a better picture of the impact of the principal
      • Creates stronger incentives for principals to work in lower performing schools
      • Pushes schools with higher performing incoming students to keep advancing their performance (past “proficiency” to “college-ready”)
      • Values all students by assessing progress from their starting points
      • Requires districts to look at same-student comparisons rather than “cohort to cohort” comparisons whenever possible
    • Where possible, use multiple growth measures
    • Relative weight on attainment (or on maintenance of growth) might increase as the performance level of the school increases
  • Should we treat low-performing schools and high-performing schools differently or the same?
    • There is a ceiling to growth on proficiency, suggesting two changes for high-performing schools:
      • Give schools gain/growth points if they exceed a proficiency ceiling (e.g., the Chicago approach)
      • Tie a portion of the gain/growth goal to their success in increasing the percent of students meeting the “advanced” category on current assessments
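
One way to read the weighting question above is as a composite index over normalized component scores. The sketch below combines three components with placeholder weights; the weights and scores are illustrative assumptions, not values recommended in this deck:

```python
# Minimal sketch of combining attainment, gain/growth, and value-added into
# one rating. The weights are placeholders; the deck suggests the relative
# weight on attainment might rise with a school's performance level.

def composite_rating(attainment, gain, value_added,
                     weights=(0.25, 0.25, 0.5)):
    """Weighted average of three component scores, each normalized 0-100."""
    wa, wg, wv = weights
    assert abs(wa + wg + wv - 1.0) < 1e-9, "weights must sum to 1"
    return wa * attainment + wg * gain + wv * value_added

# A school with modest attainment but strong movement measures:
print(composite_rating(attainment=60, gain=80, value_added=85))  # 77.5
```

Shifting weight toward the movement measures, as the slide recommends, lets a low-attainment school with strong growth earn a high rating.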

  9. Balancing attainment, growth, and value-add: An illustration
  [Chart labels:]
  • Shift to growing the percentage of students reaching “advanced”
  • Emphasis on measures of growth
  • Reward principals for maintaining high levels of achievement

  10. DCPS Principal Evaluation Components

  11. New York City Principal Evaluation Components
  • 40-50% of the New York evaluation is made up of student outcome data
    • 26% of a school’s graded progress is focused on student outcomes
    • 14-24% of the School Specific Goals are focused on student outcomes
  *In New York City, a School Quality Review is a two- or three-day visit by experienced educators to a school. The visit typically includes classroom observations, conversations with school leaders and stakeholders, and examinations of student work. New York City has developed a rubric to guide the visits and to determine how well organized a school is to educate its students.

  12. Chicago “Performance Calculators” for Principals

  13. Adjusting for student characteristics
  • Should we include controls in the value-added growth models to account for student characteristics?
    • Increases the accuracy of value-added estimates
    • Controls can be changed from year to year to alter the approach to a given population (e.g., special education, English language proficiency, homelessness)
    • There may be some value in excluding some controls, at the expense of maximal accuracy of estimates, in order to signal heightened responsibility for schools to accelerate achievement for low-income students of color
  • Should we give extra weight for improving results for students who start out further behind?
    • Set targets that expect faster growth for lower performing students in the district/state
  • How should we address the question of student mobility?
    • VARC and others use methods that assign portions of value-added growth to a school based on the percentage of the school year a student has been enrolled at the school
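
The VARC-style mobility adjustment mentioned above, crediting each school in proportion to the share of the year a student was enrolled there, can be sketched as a dosage-weighted average. The student records below are made up for illustration:

```python
# Sketch of dosage-weighted attribution: a school's average growth counts
# each student in proportion to the share of the year spent enrolled there.
# Records are hypothetical (student growth, fraction of year enrolled).

def dosage_weighted_growth(records):
    """Enrollment-weighted mean growth for one school."""
    total_weight = sum(frac for _, frac in records)
    return sum(growth * frac for growth, frac in records) / total_weight

records = [
    (10.0, 1.0),   # enrolled all year: full weight
    (4.0, 0.5),    # transferred in at midyear: counts half
    (8.0, 0.25),   # enrolled one quarter of the year
]
print(dosage_weighted_growth(records))  # 8.0
```

A student who splits the year across two schools would appear in both schools' record lists, with fractions summing to 1, so each school is credited only for its share of the year.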

  14. Years of data used for judgments of principals
  • How many years of data should be used for any single year’s rating on student growth?
    • Given the variation in single-year results, evaluate student outcomes based on multi-year trends
    • Note: Value-added estimates are more reliable at the school level than at the classroom level, since higher student numbers reduce the impact of year-to-year fluctuations. But we want to create incentives for long-term improvement, not quick fixes.
    • Provide additional time or use more years of data for early-tenure principals
    • Plan for the availability of sufficient data before any significant consequences (e.g., ensuring the most recent test data are available before making spring retention decisions)
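
Rating on multi-year trends rather than a single year can be sketched as an (optionally recency-weighted) average. The scores and weights below are illustrative assumptions, not a recommended policy:

```python
# Sketch of basing a rating on a multi-year trend rather than one year's
# result. Scores and recency weights are hypothetical.

def multi_year_score(yearly_scores, weights=None):
    """Weighted mean of several years of growth results; with no weights
    given, all years count equally."""
    if weights is None:
        weights = [1.0] * len(yearly_scores)
    assert len(weights) == len(yearly_scores)
    return (sum(s * w for s, w in zip(yearly_scores, weights))
            / sum(weights))

# A noisy single-year result (95) is tempered by the two prior years,
# even with the most recent year weighted double:
print(multi_year_score([70.0, 75.0, 95.0], weights=[1.0, 1.0, 2.0]))  # 83.75
```

Averaging over years damps single-year noise, which matters most for small schools and for early-tenure principals with short data histories.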

  15. Processes for adaptation
  • What guidelines do we put in place for all districts to follow if they want to design their own systems?
    • The balance of growth and attainment should be fixed.
    • Measuring success in other academic subjects depends on the presence of reliable local assessments.
    • The technical capability to develop and implement value-added models is not present in most districts.
  • What should be the ongoing process for evaluating the system and adapting it?
    • Among other things, the state will need to adjust its test measures when the PARCC assessments are rolled out in 2014-15.

  16. Overview of the Webinar
  • Review of guiding questions for sub-committee consideration
  • Introduction and review of value-added measures and update on value-added models being created in CPS
  • Discussion of guiding questions

  17. Common Approaches to Measuring Student Success
  Our overall goal is to measure the performance of a principal based on student performance. How is this accomplished?
  Source: VARC (http://varc.wceruw.org/tutorials/Oak/index.htm)

  18. Understanding Value-Added Measures
  Stephen Ponisciak, Value-Added Research Center, School of Education, University of Wisconsin-Madison

  19. Overview of the Webinar
  • Review of guiding questions for sub-committee consideration
  • Introduction and review of value-added measures and update on value-added models being created in CPS
  • Discussion of guiding questions

  20. Guiding questions on student outcomes
  • What measures should be used in evaluating principals?
  • What is the right balance between value-added growth measures and attainment measures?
  • How, if at all, should we adjust our judgments based on a school’s demographics and other characteristics, like student mobility?
  • How many years of data should be used for any single year’s rating on student growth?
  • What processes and parameters should guide local flexibility and adaptation to the state system over time?
