
APPR Update



Presentation Transcript


  1. March 28, 2011 APPR Update

  2. Chapter 103 Review What does the new law require?

  3. New system for teachers (and principals) • 20% State student growth data (increases to 25% upon implementation of value-added growth model) • 20% Locally selected (and agreed-upon) measures (decreasing to 15%) • 60% Multiple measures based on standards TBD
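A quick way to see how the weights combine: the sketch below computes a single composite from the three component scores, assuming each component has already been scored on a common 0-100 scale. The 20/20/60 split (shifting to 25/15/60 once the value-added model is implemented) comes from the slide; the function name and inputs are illustrative.

```python
def composite_score(state_growth, local_measures, other_measures,
                    value_added_implemented=False):
    """Weighted composite of the three APPR components.

    Assumes each component is a 0-100 score; the weights follow the
    slide: 20/20/60, shifting to 25/15/60 once the value-added
    growth model is implemented.
    """
    weights = (0.25, 0.15, 0.60) if value_added_implemented else (0.20, 0.20, 0.60)
    scores = (state_growth, local_measures, other_measures)
    return sum(w * s for w, s in zip(weights, scores))
```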

  4. New system for teachers (and principals) Being referred to as HEDI (pronounced Heidi) • Highly effective (possibly >90) • Effective (possibly 80-90) • Developing (possibly 65-79) • Ineffective (possibly 0-64)
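The tentative bands translate into a simple score-to-rating lookup. A minimal sketch using the "possibly" cut scores from the slide (the final cut scores had not been set at the time):

```python
def hedi_rating(composite):
    """Map a 0-100 composite score to a HEDI rating, using the
    tentative bands from the slide (not final)."""
    if composite > 90:
        return "Highly effective"
    elif composite >= 80:
        return "Effective"
    elif composite >= 65:
        return "Developing"
    else:
        return "Ineffective"
```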

  5. New system for teachers (and principals) • A single composite score of teacher (or principal) effectiveness

  6. New system for teachers (and principals) • Training for all evaluators (through Network Teams – after first week of August) • Use of improvement plans for developing and ineffective ratings • Utilize in other decisions (merit, etc.) • Locally-developed appeals process • Expedited 3020-a process after two ineffective ratings

  7. New system for teachers (and principals) • All agreements after July 1, 2010 • For agreements prior to July 1, 2010, it depends on specific language in agreement • 4-8 math and ELA (and principals) July 2011 • Everyone else July 2012 • Implementation of the value-added growth model (20% > 25%) 2012-2013

  9. Timetable Board of Regents Agenda

  10. Timetable • January: 60% discussion • February: Local 20% discussion • March: Value added 20% discussion and ratings/scores • April: Regents Task Force recommendations (4th) • May: Draft Regulations • June: Emergency Adoption of Regulations

  11. State student growth data 20% increasing to 25%

  12. State student growth data • Value Added/Growth model • Annual achievement is more about the students than the teacher [Chart: 2015 average scores: Teacher A 680, Teacher B 670]

  13. State student growth data • Value Added/Growth model • Adding average prior achievement for the same students shows growth [Chart: Teacher A: 660 (2014) to 680 (2015), +20 growth; Teacher B: 645 (2014) to 670 (2015), +25 growth]
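In other words, a classroom's growth is simply its current-year average minus the prior-year average for the same students. Reproducing the slide's arithmetic in Python:

```python
# Growth = current-year average minus prior-year average for the same students.
teacher_a_growth = 680 - 660   # +20 growth, as shown for Teacher A
teacher_b_growth = 670 - 645   # +25 growth, as shown for Teacher B
```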

  15. State student growth data • Value Added/Growth model • But what growth should students have shown? • What growth did similar students obtain? • What is the difference between the expected growth and the actual growth?

  16. State student growth data • Value Added/Growth model • Comparing growth to the average growth of similar students is the value-added [Chart: Teacher A: 660 (2014) to 680 (2015), +20 growth; 2015 avg. for similar students 665, +15 val add. Teacher B: 645 (2014) to 670 (2015), +25 growth; 2015 avg. for similar students 665, +5 val add]
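Value-added, then, is the gap between a classroom's actual current-year average and the current-year average obtained by similar students. Reproducing the slide's arithmetic:

```python
# Value-added = actual current-year average minus the average for
# similar students in the same year.
teacher_a_value_added = 680 - 665   # +15 val add, as on the slide
teacher_b_value_added = 670 - 665   # +5 val add, as on the slide
```

Note that Teacher B shows more raw growth (+25 vs. +20) but less value-added, because the comparison is against what similar students achieved rather than against the classroom's own prior scores.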

  18. State student growth data • Calculating similar student growth • Lots of statistical analysis • Student characteristics such as academic history, poverty, special ed. status, ELL status, etc. • Classroom or school characteristics such as class percentages of needs, class size, etc.
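SED had not published its statistical model at this point, so the following is purely an illustration of the general idea: fit a statewide regression predicting current-year scores from prior achievement and student/classroom characteristics, then take a classroom's average residual (actual minus expected) as its value-added. The synthetic data, the particular characteristics used, and the choice of ordinary least squares are all assumptions, not the state's method.

```python
import numpy as np

# Illustration only, not SED's actual model: predict expected scores
# from prior achievement plus student/classroom characteristics, then
# measure a classroom as its mean (actual - expected) residual.
rng = np.random.default_rng(0)

# Synthetic statewide data. Columns: prior score, poverty, ELL, class size.
n = 5000
X = np.column_stack([
    rng.normal(650, 30, n),     # prior-year score
    rng.integers(0, 2, n),      # poverty flag
    rng.integers(0, 2, n),      # ELL flag
    rng.integers(18, 32, n),    # class size
])
X1 = np.column_stack([np.ones(n), X])                # add intercept
true_beta = np.array([40.0, 0.95, -3.0, -2.0, -0.1])
y = X1 @ true_beta + rng.normal(0, 8, n)             # actual scores

beta, *_ = np.linalg.lstsq(X1, y, rcond=None)        # fit statewide model

def value_added(student_rows, actual_scores):
    """Classroom mean of (actual - expected): growth beyond what
    statistically similar students obtained."""
    rows = np.column_stack([np.ones(len(student_rows)), student_rows])
    return float(np.mean(actual_scores - rows @ beta))

# One classroom's students (same four characteristics) and actual scores:
va = value_added(
    np.array([[660, 0, 0, 24], [645, 1, 1, 28], [672, 0, 0, 24]]),
    np.array([690, 668, 695]),
)
```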

  19. State student growth data • Data collection and policy options • Linking students, teachers, and courses • Who is the teacher of record? • Scenario 1: Same Teacher the Entire Year • Scenario 2: Team Teaching • Scenario 3: Teacher for Part of the Year • Scenario 4: Student for Part of the Year • Scenario 5: Student Supplemental Instruction • Additional Scenarios???
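Several of these scenarios come down to partial-year or shared links between a student and a teacher. One approach used elsewhere, offered here only as an illustration since New York's rule was still undecided, is "dosage" weighting: each student-teacher link counts in proportion to shared enrollment time.

```python
def dosage_weights(enrollment_days, course_days=180):
    """Weight each student-teacher link by the fraction of the course
    the student spent with that teacher (illustrative rule only)."""
    return {student: days / course_days
            for student, days in enrollment_days.items()}

# Scenario 3 (teacher for part of the year) or Scenario 4 (student for
# part of the year) simply reduce the weight; Scenario 2 (team teaching)
# could split the same days across both teachers.
weights = dosage_weights({"student_1": 180, "student_2": 90})
# {'student_1': 1.0, 'student_2': 0.5}
```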

  20. State student growth data Non-tested areas

  21. Non-tested areas • Teachers of classes with only one state test administration • K-12 educators • High school (no test) educators • Middle and elementary (no test) educators • Performance courses • Others

  22. Non-tested areas • Use existing assessments in other content areas to create a baseline for science tests and Regents examinations • Use commercially available tests to create a baseline and measure growth

  23. Non-tested areas • Add more state tests, such as: • Science 6-8 • Social studies 6-8 • ELA 9-11 (2011-2012) • PARCC ELA 3-11 (2014-2015) • PARCC math 3-11 (2014-2015)

  24. Non-tested areas • Add more state tests per the December 2009 Regents Item, discussed and approved prior to inclusion in SED’s plans: • ELA 9-11 (2011-2012)

  25. Non-tested areas • Add more state tests, subject to funding availability and approval, such as: • Science 6-7 • Social studies 6-8

  26. Non-tested areas • A growth model can also be used for school accountability measures • Collaborate with state-wide professional associations or a multi-state coalition • Empower local-level resources to create and carry out a solution that meets state requirements

  27. Non-tested areas • Use a group metric that is a measure of the school’s (or grade’s) overall impact • In other states where this is implemented, it tends to be tied to performance bonuses

  28. Local assessment measures 20% decreasing to 15%

  29. Local assessment measures • Objectives include: • Provide a broader picture of student achievement by assessing more • Provide a broader picture by assessing differently • Verify performance of state measures

  30. Local assessment measures • Reality check: • Balance state/regional/BOCES consistency while accounting for local context • School-based choice might appeal to teachers • Districts must be able to defend their decisions about the tests

  31. Local assessment measures • Considerations include: • Rigor • Validity and reliability • Growth or achievement measures • Cost • Feasibility

  32. Local assessment measures • Options under consideration: • Districts choose or develop assessments for courses/grades • Commercially available products • Group metric of school or grade performance • Other options that meet the criteria (previous slide)

  33. Other 60% Multiple measures

  34. Other 60% • Begins with the teaching standards: • Knowledge of Students and Student Learning • Knowledge of Content and Instructional Planning • Instructional Practice • Learning Environment • Assessment for Student Learning • Professional Responsibilities and Collaboration • Professional Growth

  35. Other 60% • Begins with the teaching standards: • Some things observable • Some not observable, thus requiring some other form of documentation or artifact collection

  36. Other 60% • Teacher practice rubrics: • Describe differences in the four performance levels • Articulate specific, observable differences in student and teacher behavior • Not known whether there will be a single rubric, menu to choose from, or total local option

  38. Other 60% • Other items that might be included: • Teacher attendance • Goal setting • Student surveys • Portfolios/Evidence binders • Other observer

  39. Timetable Board of Regents Agenda

  40. Timetable • January: 60% discussion • February: Local 20% discussion • March: Value added 20% discussion and ratings/scores • April: Regents Task Force recommendations • May: Draft Regulations • June: Emergency Adoption of Regulations

  41. Timetable • August: NT Training (includes evaluator training) • August: NT turns training over to local evaluators • September: Implementation for covered teachers

  42. NT training for teacher evaluators • Tentative dates set (with multiple options): • August 15, Rodax 8 Large Conference Room • August 22, McEvoy Conference Center • August 29, Rodax 8 Large Conference Room • Ongoing training during year (TBD)

  43. Training for principal evaluators • Tentative dates set (with multiple options): • August 19, Rodax 8 Small Conference Room • August 26, McEvoy Conference Center • Ongoing training during year (TBD)

  44. While we wait • Regional/BOCES collaboration: • Share data • Share APPR Plans • Build common understanding • Work on parts under local jurisdiction • Avoid duplication of work • Have a common voice

  45. While we wait • APPR sub-site: • APPR button under “for school districts” at ocmboces.org or leadership.ocmboces.org • User name: lrldocs • Password: CBA1011

  46. While we wait

  47. While we wait • Regional/BOCES collaboration: • Development of local 20% protocol • Achievement in non-tested areas • Qualities of effective Improvement plans and examples • Appeals process • Frameworks/models • Summative evaluation (examples, best practices, share practices) • Principal Evaluation (added back)

  48. Next steps • Share results of this afternoon’s work • Gather again on __________ • Updates • Continue collaboration
