
APPR Update


Presentation Transcript


  1. March, 2011 APPR Update

  2. Chapter 103 Review What does the new law require?

  3. New system for teachers (and principals) • 20% State student growth data (increases to 25% upon implementation of value-added growth model) • 20% Locally selected (and agreed-upon) measures (decreasing to 15%) • 60% Multiple measures based on standards TBD
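
The weighting above can be sketched as a small calculation. This is an illustrative sketch only: the statute fixes the percentage weights (20/20/60, shifting to 25/15/60 once the value-added growth model is implemented), but the 0-100 subscores, the function name, and the example numbers are assumptions, not part of the law.

```python
# Illustrative sketch of the APPR composite weighting described above.
# The 0-100 point scales and names are assumptions; only the
# percentage weights come from the slide.

def composite_score(state_growth, local_measures, other_measures,
                    value_added_implemented=False):
    """Combine three subscores (each assumed to be on a 0-100 scale)
    using the statutory weights: 20/20/60, shifting to 25/15/60 once
    the value-added growth model is implemented."""
    if value_added_implemented:
        weights = (25, 15, 60)
    else:
        weights = (20, 20, 60)
    parts = (state_growth, local_measures, other_measures)
    return sum(w * p for w, p in zip(weights, parts)) / 100

print(composite_score(80, 70, 90))        # (20*80 + 20*70 + 60*90)/100 = 84.0
print(composite_score(80, 70, 90, True))  # (25*80 + 15*70 + 60*90)/100 = 84.5
```

Note how the shift to the value-added model moves weight from the local measures to the state growth component, so the same three subscores can produce a slightly different composite.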

  4. New system for teachers (and principals) • Highly effective • Effective • Developing • Ineffective

  5. New system for teachers (and principals) • A single composite score of teacher (or principal) effectiveness

  6. New system for teachers (and principals) • Training for all evaluators (through Network Teams) • Use of improvement plans for developing and ineffective ratings • Use of ratings in other decisions • Locally-developed appeals process • Expedited 3020-a process after two ineffective ratings

  7. New system for teachers (and principals) • All agreements after July 1, 2010 • 4-8 math and ELA (and principals) July 2011 • Everyone else July 2012 • Implementation of the value-added growth model (20% > 25%) 2012-2013

  8. Timetable Board of Regents Agenda

  9. Timetable (Month / Action) • January: 60% discussion • February: Local 20% discussion • March: Value added 20% discussion and ratings/scores • April: Regents Task Force recommendations • May: Draft Regulations • June: Emergency Adoption of Regulations

  10. Timetable (Month / Action) • January: 60% discussion • February: Local 20% discussion • March: Value added 20% discussion and ratings/scores • April: Regents Task Force recommendations • May: Draft Regulations • June: Emergency Adoption of Regulations

  11. State student growth data 20% increasing to 25%

  12. State student growth data • Value Added/Growth model • Annual achievement is more about the students than the teacher • [Chart: 2015 scores: Teacher A 680, Teacher B 670]

  13. State student growth data • Value Added/Growth model • Adding average prior achievement for the same students shows growth • [Chart: Teacher A 660 (2014) to 680 (2015), +20 growth; Teacher B 645 (2014) to 670 (2015), +25 growth]

  14. State student growth data • Value Added/Growth model • Adding average prior achievement for the same students shows growth • [Chart: Teacher A 660 (2014) to 680 (2015), +20 growth; Teacher B 645 (2014) to 670 (2015), +25 growth]

  15. State student growth data • Value Added/Growth model • But what growth should students have shown? • What growth did similar students obtain? • What is the difference between the expected growth and the actual growth?

  16. State student growth data • Value Added/Growth model • Comparing growth to the average growth of similar students is the value-added • [Chart: Teacher A 660 (2014) to 680 (2015), +20 growth; 2015 avg. for similar students 665, so +15 value added. Teacher B 645 (2014) to 670 (2015), +25 growth; 2015 avg. for similar students 665, so +5 value added]

  17. State student growth data • Value Added/Growth model • Comparing growth to the average growth of similar students is the value-added • [Chart: Teacher A 660 (2014) to 680 (2015), +20 growth; 2015 avg. for similar students 665, so +15 value added. Teacher B 645 (2014) to 670 (2015), +25 growth; 2015 avg. for similar students 665, so +5 value added]
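
The arithmetic on the slide above can be worked through directly. The scale scores are the slide's own example numbers; the function name and the use of the similar-student 2015 average as the comparison point are illustrative assumptions about how the comparison is framed.

```python
# Worked version of the value-added arithmetic on the slide above.
# Scores are the slide's example scale scores; 665 is the 2015
# average for similar students (the same for both teachers here).

def value_added(score_2014, score_2015, similar_avg_2015):
    growth = score_2015 - score_2014          # actual growth
    val_add = score_2015 - similar_avg_2015   # amount beyond similar students
    return growth, val_add

print(value_added(660, 680, 665))  # Teacher A: (20, 15)
print(value_added(645, 670, 665))  # Teacher B: (25, 5)
```

Teacher B produced more raw growth (+25 vs. +20), but Teacher A's students ended further above the similar-student average (+15 vs. +5), which is the slide's point: growth and value-added can rank the same teachers differently.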

  18. State student growth data • Calculating similar student growth • Lots of statistical analysis • Student characteristics such as academic history, poverty, special ed. status, ELL status, etc. • Classroom or school characteristics such as class percentages of needs, class size, etc.
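
A toy sketch of the "similar students" idea: group students on a couple of characteristics, use each group's average 2015 score as the expected score, and treat actual minus expected as a student's value-added. Real models use far richer regression-based analysis over many covariates; all student data, groupings, and names below are invented for illustration.

```python
# Toy "similar students" grouping. Real value-added models use
# regression over many student and classroom covariates; here we
# group only by prior-score band and poverty status. Data invented.
from collections import defaultdict

students = [
    # (prior_score, poverty, actual_2015)
    (660, False, 685), (660, False, 675), (660, False, 680),
    (645, True, 668),  (645, True, 662),  (645, True, 671),
]

# Group "similar" students: same prior-score decade, same poverty flag.
groups = defaultdict(list)
for prior, poverty, actual in students:
    groups[(prior // 10, poverty)].append(actual)

# Expected score = group mean of actual 2015 scores.
expected = {key: sum(scores) / len(scores) for key, scores in groups.items()}

for prior, poverty, actual in students:
    exp = expected[(prior // 10, poverty)]
    print(f"prior={prior} actual={actual} expected={exp:.1f} "
          f"value_added={actual - exp:+.1f}")
```

Within each group the individual value-added numbers sum to zero by construction, which is why real systems compare against similar students statewide rather than within a single classroom.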

  19. State student growth data • Data collection and policy options • Linking students, teachers, and courses • Who is the teacher of record? • Scenario 1: Same Teacher the Entire Year • Scenario 2: Team Teaching • Scenario 3: Teacher for Part of the Year • Scenario 4: Student for Part of the Year • Scenario 5: Student Supplemental Instruction • Additional Scenarios???
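
One way the team-teaching and part-year scenarios above could be handled is to weight each teacher's share of a student's growth by enrollment time. This is a hypothetical attribution rule sketched for illustration, not the state's adopted policy; the function, teacher names, and numbers are invented.

```python
# Hypothetical "teacher of record" attribution rule: split a
# student's growth among teachers in proportion to the time the
# student spent with each (covers scenarios 2 and 3 above).

def attribute_growth(growth, enrollment_months):
    """enrollment_months: {teacher: months the student spent with
    that teacher}. Returns each teacher's weighted share of growth."""
    total = sum(enrollment_months.values())
    return {t: growth * m / total for t, m in enrollment_months.items()}

# A student with +20 growth who spent 6 months with Teacher A and
# 4 months with Teacher B:
print(attribute_growth(20, {"Teacher A": 6, "Teacher B": 4}))
# {'Teacher A': 12.0, 'Teacher B': 8.0}
```

The unresolved policy questions on the slide (supplemental instruction, additional scenarios) amount to deciding what counts toward `enrollment_months` and whether some roles get a share at all.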

  20. State student growth data Non-tested areas

  21. Non-tested areas • Teachers of classes with only one state test administration • K-12 educators • High school (no test) educators • Middle and elementary (no test) educators • Performance courses • Others

  22. Non-tested areas • Use existing assessments in other content areas to create a baseline for science tests and Regents exams • Use commercially available tests

  23. Non-tested areas • Add more state tests, such as: • Science 6-8 • Social studies 6-8 • ELA 9-11 (2011-2012) • PARCC ELA 3-11 (2014-2015) • PARCC math 3-11 (2014-2015)

  24. Non-tested areas • Add more state tests, such as: • Science 6-8 • Social studies 6-8 • ELA 9-11 (2011-2012) • PARCC ELA 3-11 (2014-2015) • PARCC math 3-11 (2014-2015) ???

  25. Non-tested areas • Use a group metric that is a measure of the school’s (or grade’s) overall impact • The growth model can also be used for school accountability measures • Empower local-level resources to create and carry out a solution that meets state requirements

  26. Local assessment measures 20% decreasing to 15%

  27. Local assessment measures • Objectives include: • Provide a broader picture of student achievement by assessing more • Provide a broader picture by assessing differently • Verify performance of state measures

  28. Local assessment measures • Reality check: • Balance state/regional/BOCES consistency while accounting for local context • School-based choice might appeal to teachers • Districts must be able to defend their decisions about the tests

  29. Local assessment measures • Considerations include: • Rigor • Validity and reliability • Growth or achievement measures • Cost • Feasibility

  30. Local assessment measures • Options under consideration: • Districts choose or develop assessments for courses/grades • Commercially available products • Group metric of school or grade performance • Other options that meet the criteria (previous slide)

  31. Other 60% Multiple measures

  32. Other 60% • Begins with the teaching standards: • Knowledge of Students and Student Learning • Knowledge of Content and Instructional Planning • Instructional Practice • Learning Environment • Assessment for Student Learning • Professional Responsibilities and Collaboration • Professional Growth

  33. Other 60% • Begins with the teaching standards: • Some things observable • Some not observable thus requiring some other form or documentation or artifact collection

  34. Other 60% • Teacher practice rubrics: • Describe differences in the four performance levels • Articulate specific, observable differences in student and teacher behavior • Not known whether there will be a single rubric, menu to choose from, or total local option

  35. Other 60% • Teacher practice rubrics: • Describe differences in the four performance levels • Articulate specific, observable differences in student and teacher behavior • Not known whether there will be a single rubric, menu to choose from, or total local option

  36. Other 60% • Other items that might be included: • Teacher attendance • Goal setting • Student surveys • Portfolios/Evidence binders • Other observers

  37. Timetable Board of Regents Agenda

  38. Timetable (Month / Action) • January: 60% discussion • February: Local 20% discussion • March: Value added 20% discussion and ratings/scores • April: Regents Task Force recommendations • May: Draft Regulations • June: Emergency Adoption of Regulations

  39. Timetable (Month / Action) • July: NT training (includes evaluator training) • August: NT turns training over to local evaluators • September: Implementation for covered teachers
