
TEACHER EVALUATION IN NEW JERSEY: PREVIEW OF FINAL REPORT


Presentation Transcript


  1. TEACHER EVALUATION IN NEW JERSEY: PREVIEW OF FINAL REPORT
     RUTGERS GRADUATE SCHOOL OF EDUCATION

  2. EDUCATIONAL ADMINISTRATION PROGRAM: RUTGERS GRADUATE SCHOOL OF EDUCATION
     • Major Purpose: Introduce ourselves and our potential to be of assistance.
     • Specific Purpose: Talk about teacher evaluation:
       • Bill Firestone: Preview of the assessment of the Teacher Evaluation Pilot (EE4NJ)
       • Bruce Baker: How to understand value-added and growth data for teacher evaluation
       • Melinda Mangin: Using teacher evaluation to improve teaching

  3. OVERVIEW
     • 2-year assessment of the teacher evaluation process.
     • Sample:
       • 10 districts in the 1st year.
       • 25 districts in the 2nd year.
       • Diverse with respect to size, wealth, and location.
     • Mix of surveys (of teachers and administrators) and site visits.
     • Focused more on local use of teacher practice measures than on student growth measures.
     • Asked about:
       • Perceptions of implementation
       • Orientations toward the program
       • Perceived facilitators & barriers

  4. Implementation
     • Ability to get a complete set of observations of all teachers.
     • Year 1:
       • Based on analysis of teacher practice data submitted by 10 districts to NJDOE.
       • A final summary evaluation was provided for between 60% and 91% of teachers per district.
     • Year 2:
       • Based on an administrator survey in 25 districts.
       • Project directors who expected to complete all required observations for each teacher:
         • 2nd-year districts: 8 of 11
         • 1st-year districts: 3 of 13
     • In sum: Completing all observations is problematic, but it may become easier since fewer observations are now required than in Year 2 of the pilot.

  5. Factors Affecting Perceived Accuracy & Fairness
     • Consistency of raters within districts:
       • In 4 districts, focus group teachers noted that observers would disagree when rating the same teacher.
       • In 3 districts, teachers noted that observers were consistent.
     • Distance from the classroom:
       • Measurement experts say observers with less contact have fewer incentives to raise ratings.
       • Focus group teachers noted that observers with less contact would not know the unusual contextual factors that might affect their practice.
       • 80% of surveyed teachers said it was important that observers "know my classroom well."
     • Observer's content knowledge:
       • 86% of surveyed teachers said it was important for observers to have "content knowledge in your content area."

  6. ORIENTATIONS II: Purpose & Tenure
     • Roughly equal proportions of teachers think a major purpose of teacher evaluation is to:
       • Make tenure and promotion decisions: 57%
       • Provide information to help teachers improve practice: 58%
     • Most tenured teachers (68%) think they are unlikely to lose tenure under the new system.
     • More new teachers think the new system will increase their chances of getting tenure (26%) than decrease them (7%).
     • Fewer than half of teachers think evaluation data helped their teaching:
       • 40-41% think observation data is helpful.
       • 46-49% think student growth data is helpful.

  7. FACILITATORS/BARRIERS: Training on Observations
     • Administrators were more satisfied with their training than teachers.
     • Percent receiving 25 or more hours of training:
       • 2nd-year administrators: 37%
       • 1st-year administrators: 58%
       • 2nd-year teachers: 3%
       • 1st-year teachers: 7%
     • Training formats:
       • Important to continue training after the initial orientation.
       • The issue may not be face-to-face vs. online or media-based training.
       • The challenge is to provide opportunities for active learning:
         • Practice scoring (for administrators AND teachers)
         • Peer discussions
       • Model what the teacher practice instruments say is good practice.

  8. FACILITATORS/BARRIERS: Time
     • Between 86% and 96% of administrators said they were spending more time doing observations or entering data, depending on cohort and year.
     • Things affected by the time crunch (from interviews):
       • Getting all observations done
       • Providing good documentation
       • Providing teachers with helpful feedback
       • Other work, including handling discipline issues, meeting with parents, and being visible in the building
     • Things that helped get observations done:
       • An effective district-wide schedule for observations
       • A clear central administration focus on, and support for, doing observations
       • Sufficient staff, or the capacity to hire or reconfigure staff to create a special observer cadre

  9. FACILITATORS/BARRIERS: Data management tool
     • Teacher practice data cannot be managed without a data management tool (Teachscape, iObservation, etc.).
     • Learning the data management tools was a substantial 1st-year challenge.
     • The tools continue to raise problems. Percent finding the data management tool easy to use for:
       • Purposes related to providing feedback to teachers: 31-36%
       • Purposes related to analyzing data and sharing reports: 14-17%
     • Reasons:
       • Lack of practice
       • Design flaws, such as excessive attention to individual confidentiality at the expense of the capacity to analyze data

  10. FINAL NOTE: DISTRICT VARIATION
     • On almost everything we looked at, there was substantial variation among districts.
     • Some districts appeared to consistently get more observations done and to be perceived as more accurate, fair, and helpful to teachers.
     • We have not yet been able to analyze what those districts did, but we have two hypotheses:
       • A pattern of integrating teacher evaluation into the overall district improvement strategy.
       • Sufficient human capacity: the numbers and knowledge of administrators in particular.
