
Music Teacher Evaluation in Michigan Dr. Phillip M. Hash, Calvin College pmh3@calvin.edu. January 7, 2012. Overview of PM Workshop. Overview of Legislation Creating an Assessment Plan Assessment Strategies Your Experience. Overview of Michigan Legislation Signed July 19, 2011.




Presentation Transcript


  1. Music Teacher Evaluation in Michigan • Dr. Phillip M. Hash, Calvin College • pmh3@calvin.edu • January 7, 2012

  2. Overview of PM Workshop • Overview of Legislation • Creating an Assessment Plan • Assessment Strategies • Your Experience

  3. Overview of Michigan Legislation (Signed July 19, 2011) • “…with the involvement of teachers and school administrators, the board of a school district or intermediate school district or board of directors of a public school academy shall adopt and implement for all teachers and school administrators a rigorous, transparent, and fair performance evaluation system that does all of the following:” • Measures student growth (vs. achievement or proficiency) • Provides relevant data on student growth (quantitative) • Evaluates a teacher's job performance “using multiple rating categories that take into account data on student growth as a significant factor” (PA 102, p. 2).

  4. Overview of Michigan Legislation(cont.) • % of eval. related to student growth • All teachers evaluated annually • “A classroom observation shall include a review of the teacher’s lesson plan and the state curriculum standard being used in the lesson and a review of pupil engagement in the lesson.” • Rated as: Highly effective, effective, minimally effective, ineffective

  5. Using Growth in 2011-12 and 2012-13 • Evaluations this year involve locally developed systems w/ student growth as a significant part • Need to figure out ways to integrate growth data into your evaluation systems (how?) • Law allows for “national, state, or local assessments and other objective (vs. subjective) criteria.”

  6. Student Assessment Tool • “All student growth and assessment data must be measured using the student growth assessment tool required in legislation based on recommendations of the Governor's Council on Educator Effectiveness.” (MI Senate, 2011) • The tool also must meet the following: • Measure student growth in all subjects (tested & non-tested) • Comply with all current law for students with a disability. • Include at least a pre- and post-test. • Usable for pupils of all achievement levels.

  7. Overview of MI Legislation • Ratings vs. seniority in personnel decisions • Probationary period increased from 4 to 5 yrs. • Administrators evaluated under same basic framework

  8. MDE Will Provide • Measures For every educator, regardless of subject taught, based on 2009-10 and 2010-11 data: • Student growth levels in reading and math • Student proficiency levels in math, reading, writing, science, social studies • Foundational measure of student proficiency and improvement (same for each teacher in a school) Understanding Michigan's Educator Evaluations, MDE (December 2010) • How will this data be used for arts educators? • Currently up to school districts • Might be specified by the state after this year

  9. Governor’s Council on Educator Effectiveness • By April 30, 2012, submit a report that recommends: • A student growth and assessment tool • State evaluation tools for teachers and administrators • Parameters for effectiveness rating categories • Subject to legislative approval

  10. Governor’s Council Appointees • Deborah Ball – Dean, UM School of Ed. • Mark Reckase – Professor, MSU College of Education • Nicholas Sheltrown – Director, National Heritage Academies • David Vensel – Principal, Jefferson HS • Jennifer Hammond – Principal, Grand Blanc High School • Joseph Martineau – Director, MDE Bureau of Assessment & Accountability

  11. Performance-Based Compensation • A district shall implement a compensation method for teachers and administrators that includes “job performance and job accomplishments as a significant factor” to determine “compensation and additional compensation.” MCL 380.1250(1) • Meaning for arts educators?

  12. New Prohibited Bargaining Subjects • 1. Teacher Placement • 2. Reduction in Force/Recall • 3. Classroom Observation • 4. Sec. 1249 Performance Evaluation • 5. Teacher Discharge/Discipline • 6. Performance-Based Compensation • 7. Parent Notification (§ 1249a) • PERA Section 15(3), PA 103 of 2011

  13. Creating an Assessment Plan • District Music Faculty (by area) • Est. curriculum based on MI Standards • What should students in each grade level know and be able to do? • How and when will objectives be assessed? • Perhaps not every grade every year • How will assessments show growth? (Define NP, PP, P, HP?) • Take plan to administration for approval • The law requires that this be done “with the involvement of teachers” • Pilot, Review, Revise, Implement

  14. Developing Local Assessment Strategies

  15. Assessment Terms • Reliability = Consistency • Test/retest (regardless of yr., location, etc.) • Interrater (every judge the same) • Validity = the extent to which an assessment measures what it purports to measure • Authentic Assessment = Students demonstrate knowledge and skills in real-world context (e.g., performance) • Quantitative – data is numerical (anything that can be counted, percentages) • Qualitative – data is in words (descriptions, written critiques) • Formative vs. Summative – assessment during instruction to guide learning vs. assessment at the end to judge achievement • Formal vs. Informal – planned, graded measures vs. spontaneous checks for understanding

  16. Possible Measures • Individual performance exam(s) • Written exam(s) • Festival Ratings (?) or other group measure • Combination of above • Indiv. student growth • Based on progressive materials • Standards based

  17. MI Grade Level Content Expectations (June 2011) • What students should know and be able to do in grades K-8, & HS • Aligned w/ VPAA & 21st century skills • Standards, & benchmarks by grade level • Teachers evaluated on use of standards • [See handout]

  18. Assessment Characteristics • National, state, & local tests, “and other objective criteria” • Pre & post test • Previous year’s test might serve as pre-test • Objective = quantitative grading[?] • Data must translate to a score • Rubrics for subjective assessments • Multiple measures • Administratively more time consuming • Students demo learning in multiple ways • Progressive assessments

  19. Showing Growth – One Way • For each assessment • Determine Proficiency levels (NP, PP, P, A) the same way you would determine grades • BUT keep in mind MEAP scores for proficiency! • Create baseline scores to determine proficiency levels • Instruction time, resources, & other factors • Proficient • 3rd math = 72% • 4th writing = 60% • 7th writing = 68% • Partially Proficient ranges from 57%-83% depending on subject & grade • Average = 64.45
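The cut-score approach on this slide can be sketched in code. This is a minimal illustration only: the cut scores below are hypothetical placeholders, not the district or MEAP values discussed above, and each district would set its own.

```python
# Illustrative proficiency classifier. The cut scores are hypothetical
# examples, NOT actual MEAP or district values.
def proficiency_level(score, cuts=(57, 69, 85)):
    """Map a percentage score to NP / PP / P / A using ascending cut scores."""
    pp_cut, p_cut, a_cut = cuts
    if score < pp_cut:
        return "NP"  # Not Proficient
    if score < p_cut:
        return "PP"  # Partially Proficient
    if score < a_cut:
        return "P"   # Proficient
    return "A"       # Advanced

print(proficiency_level(72))  # P, with these example cut scores
```

Running the same classifier on a pre-test and a post-test for each student yields the two proficiency levels whose change ("performance level change") feeds the growth calculation.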

  20. Performance Level Change (“growth”)

  21. One Possible Method • Weight the PLCs to give educators more credit for more student improvement and to take away credit for declines. • One possible rating system: • Weighted score total ÷ number of students = growth score

  22. Possible Method (cont’d) • Could adjust the weights if desired (more/less credit for SI or SD, etc.) • Another possibility: If the student scored in the “Advanced” category in the previous year and is still in the “Advanced” category, award a weight of “improving” even if the student maintained or declined.

  23. Example (cont.) • To calculate the teacher’s percent of students demonstrating growth, divide Weighted PLC by number of students: 3/8 = 37.5% • Compare growth score with the district, school, or grade level average to determine teacher effectiveness in the student growth category (50% by 2015-16)
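The weighting and division steps above can be sketched as follows. The weight values and category names are illustrative assumptions (the slides leave the actual weights to local choice); only the final division matches the slide's 3/8 = 37.5% example.

```python
# Hypothetical weights for performance-level changes (PLC); actual
# weights would be set locally, as the slides note.
PLC_WEIGHTS = {
    "significant improvement": 1.5,
    "improvement": 1.0,
    "maintaining": 0.0,
    "decline": -1.0,
    "significant decline": -1.5,
}

def growth_score(plcs):
    """Sum the weighted PLCs and divide by the number of students."""
    total = sum(PLC_WEIGHTS[p] for p in plcs)
    return total / len(plcs)

# Eight students whose weighted PLCs total 3 -> 3/8 = 37.5%,
# matching the slide's example.
students = ["improvement"] * 3 + ["maintaining"] * 5
print(f"{growth_score(students):.1%}")  # 37.5%
```

The resulting percentage would then be compared with the district, school, or grade-level average, as the slide describes.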

  24. Ensuring Integrity • Self-created, administered, and graded assessments • Colleagues & administrators will ask • Standards-based assessments • Identical throughout the district • Demonstrate validity & reliability • Explain/demonstrate process for creating, administering, & grading • Demonstrate connection b/w state standards and assessments • Archive recordings

  25. www.vocaroo.com • Audio emails • Archived up to 5 months • Sends link to an email address • Download as .WAV or .Ogg • Useful for performance tests • Very easy! • http://vocaroo.com/?media=vAdx5RJr1DVC7upIc

  26. Carnegie Hall Royal Conservatory Achievement Program (http://www.theachievementprogram.org/program/all-other-programs-syllabi) • Recorder, strings, woodwinds, brass, percussion, voice • Graded preparatory, 1-10 • RC Grade 8 considered college entrance • Includes solos, etudes, scales/arpeggios, ear training, sight reading, theory

  27. CHRCAP (cont.) • ALTER & ADAPT for your own situation • e.g., all band instruments do the same scales • Substitute method bk. page for etudes • Determine grade level proficiency • 8th grade = CHRCAP grade 2 • Design your own assessment form (69% = prof.?)

  28. Excellence in Theory or Standard of Excellence Music Theory & History Workbooks • Kjos - publisher • 3 volumes (see handout sample) • Includes theory, ear training, history • Takes MS & HS years to complete 3 volumes • Students work on lessons during down time in rehearsal • Establish grade level expectations and written exam

  29. Locally Designed Assessments • Create an annual music assessment • Based on standards & curriculum • Progressive by grade • Written & performance • See examples in handout for gr. 3 and HS orchestra

  30. Rubistar (http://rubistar.4teachers.org/) • Create rubrics using existing descriptors • Search other teachers’ rubrics for samples • Edit to fit your needs

  31. Performance Quiz (Piano) • Quiz #1: Scales • Two octaves, hands together, ascending and descending • Keys ____________

  32. Resources • Music Assessment Web Site, created by Ed Asmus • http://www.music.miami.edu/assessment/ • Provides forms, glossary, rubrics, templates, software and links. • McGraw Hill Web Site • http://spotlightonmusic.macmillanmh.com/national/teachers • Provides links to free downloadable graphic organizers • Textbook series includes worksheets, quizzes, tests and other tools for assessing children’s musical skill and understanding • Music Ace Software (http://www.harmonicvision.com/). Among other things, it allows teachers to: • Import assessment data from earlier versions • Export assessment data in industry-standard format • Archive student and group assessment data

  33. Festival Ratings

  34. NAfME Position Statement • Successful music teacher evaluation must, where the most easily observable outcomes of student learning in music are customarily measured in a collective manner (e.g., adjudicated ratings of large ensemble performances), limit the use of these data to valid and reliable measures and should form only part of a teacher’s evaluation. (NAfME, 2011)

  35. Festival Ratings: Advantages • Provide quantitative third party assessment • Can show growth over time in some circumstances • Individual judges’ ratings • Repertoire difficulty • 3 yr. period • Valid to the extent that they measure the quality of an ensemble’s performance of three selected pieces & sight reading at one point in time • Likely reliable over 3-yr. period based on previous research • Probably adaptable to state-wide evaluation tool • Assess a few performance standards

  36. Ratings ≠ MEAP or MME Exams • MEAP & MME: • Same for all each yr. • Rel. and val. established • Many standards • Individual • Mostly objective • Reflect multiple levels of achievement • Ratings: • Rep., adj. change • Val. & rel. not est. • Per. standards only • Group • Mostly subjective • 90%+ earn I or II out of V ratings.

  37. Festival/Contest Ratings: Challenges • Reliability • Curricular limitations • Score Inflation • Ratings Effectiveness in differentiating quality • Influence of non-performance factors • Group vs. Individual performance • Other factors • Role of MSBOA & MSVMA?

  38. Experiences

  39. Describe Your Situation • In roundtables by area? • How are you measuring student growth at your school? • What support are you getting? • What needs or concerns do you have?

  40. Presenter Contacts • Dr. Abby Butler, Wayne State University, (abby.butler@wayne.edu) • Rick Catherman, Chelsea Public Schools, (rcatherman@chelsea.k12.mi.us) • Dr. Phillip Hash, Calvin College, (pmh3@calvin.edu)
