
Performance Task Coaching Model



Presentation Transcript


  1. Performance Task Coaching Model Faculty (and Allison Zmuda) zmuda@competentclassroom.com

  2. Stage 2 – Assessment Evidence Performance Task(s) (T): Other Evidence (OE): • Require application of understanding, knowledge, and skill • Ability to think strategically, make judgments, and draw conclusions • Must do it on their own

  3. Definition of a Performance Task A performance task is every student’s opportunity to produce evidence of learning and strategic application through: • solving relevant problems • making sense of real life situations • pursuing curiosities (student-driven questions for research, analysis, communication and reflection)

  4. Connected to HSSIS Theory of Action • IF staff design rigorous forms of accountability that require students to think creatively and innovatively and to test their own assumptions, and are transparent about how we assess, give feedback, and provide opportunities for revision • THEN students will see the relevance of what they are asked to do and have a growth mindset about their work, so that they can take ownership of their learning.

  5. Key questions to drive the design • What would real use of the content look like? • What should students ultimately be able to say and do with content if they get it? • How can we provide the necessary materials to foster independence?

  6. What should a good performance task empower students to do? • Provides students opportunities to demonstrate what they know and can do based on prior learning. • Allows students to make choices about a problem and identify a possible solution that connects to the world. • Utilizes students’ prior knowledge, transferring it to the problem they are trying to solve. • Allows students to connect with something relevant to their lives and use knowledge to experience something new.

  7. Between this afternoon and January 31… • Finish developing materials • Assign performance task • Collect samples of student work • Do a brief self-reflection of how it went

  8. Between January 31st and June 7th… • Do it again…

  9. Protocol • CONTEXT: Why did you choose this performance task? What inspired you? • GOALS: What does the performance task require? • Make a distinction between the purpose (the reason) and the product (the form it will take). • How does it address transfer, meaning making, and acquisition? • How do you ensure that every student has an opportunity to show his or her learning? • How are enduring understandings and essential questions connected to this task? • How do you connect this to larger outcomes (NYS Learning Standards, ISSN High School Graduate)?

  10. Protocol • CRITERIA: What are clear criteria that students can use as a guide to measure and monitor performance? • PROCESS: How and when will the performance task be introduced? • What are the learning activities that you plan in order to prepare them to do this task on their own? (The goal is not to co-opt their independence. Do it in another context so that they still have to apply their learning.) • How do we structure feedback along the way (with opportunities to revise the performance task) based on the same set of criteria to support student learning? • How are we supporting kids in a low-stress environment (formative assessments) so that they can succeed on an independent performance task?

  11. Seeing it in action • Fishbowl exercise to model what it looks like • Immediately after, we will move into our breakout rooms • 9:45–10:45am Interdisciplinary groups • 11:00am–12:00pm Revise own performance task • 12:00–1:00pm Lunch

  12. Rubrics • Essential Question — How do I develop and use a rubric to measure what matters? • Understanding — Clarity of expectations and feedback along the way improves student achievement.

  13. A rubric is… • A protocol using a set of scoring guidelines/criteria • A scale that indicates different levels of proficiency • Ideally accompanied by examples of products or performances illustrating the different performance levels

  14. Why design rubrics? • To give a clear picture of what quality looks like on the task • To reliably and efficiently report student performance • To provide students a meaningful opportunity to self-assess progress • To guide revision opportunities

  15. Why design rubrics? Student view • What might they say?

  16. Features of High-Quality Rubrics Stiggins, Richard J., Judith A. Arter, Jan Chappuis, and Stephen Chappuis, Classroom Assessment for Student Learning (Portland, OR: Assessment Training Institute, 2004), pp. 201, 203 • Content: What counts? • Clarity: Does everyone understand what is meant? • Practicality: Is it easy for teachers and students to use? • Technical quality/fairness: Is it reliable and valid?

  17. Rubric Types • Rubrics may be used “holistically” or “analytically”… • “Holistic” Rubric: • Response is evaluated and scored as a single performance category • “Analytical” Rubric: • Response is evaluated with multiple descriptive criteria for multiple performance categories

  18. Establishing criteria to evaluate student work • Content • Degree of knowledge of information; understanding of concepts, principles, and processes • Process • Degree of skill or proficiency; effectiveness of the process or method used • Quality • Degree of quality evident in the product or performance • Result • Overall impact; extent to which the goal was achieved

  19. Content sample indicators • Accurate • Appropriate • Authentic • Complete • Correct • Credible • Explained • Justified • Important • In-depth • Insightful • Logical • Makes connections • Precise • Relevant • Sophisticated • Supported • Thorough • Valid

  20. Process sample indicators • Careful • Clever • Coherent • Collaborative • Concise • Coordinated • Effective • Efficient • Flawless • Followed process • Logical or reasoned • Mechanically correct • Methodical • Meticulous • Organized • Planned • Purposeful • Rehearsed • Sequential • Skilled

  21. Quality sample indicators • Attractive • Competent • Creative • Detailed • Extensive • Focused • Graceful • Masterful • Neat • Novel • Organized • Polished • Precise • Proficient • Rigorous • Skilled • Stylish • Smooth • Unique • Well crafted

  22. Result sample indicators • Beneficial • Conclusive • Convincing • Decisive • Effective • Engaging • Entertaining • Informative • Inspiring • Meets standards • Memorable • Moving • Persuasive • Proven • Responsive • Satisfactory • Satisfying • Significant • Understood • Useful

  23. Starting to build the rubric • Degrees of Understanding • 4: Thorough and complete • 3: Substantial • 2: Partial or incomplete • 1: Misunderstanding or serious misconceptions • Degrees of Independence • 4: Independently • 3: With minimal assistance required • 2: With moderate assistance required • 1: With considerable assistance required

  24. Starting to build the rubric • Degrees of Clarity • 4: Exceptionally clear, easy to follow • 3: Generally clear, able to follow • 2: Lacks clarity, difficult to follow • 1: Unclear, impossible to follow • Degrees of Accuracy • 4: Completely accurate, all correct • 3: Generally accurate, minor inaccuracies don’t affect overall result • 2: Inaccurate, numerous errors detract from result • 1: Major inaccuracies, significant errors throughout

  25. Starting to build the rubric • Degrees of Effectiveness • 4: Highly effective • 3: Generally effective • 2: Somewhat effective • 1: Ineffective • Degrees of Frequency • 4: Always or consistently • 3: Frequently or generally • 2: Sometimes or occasionally • 1: Rarely or never

  26. Constructing a Rubric • Take your existing idea for a summative performance assessment • Look at the Stage 1 goals (Standards, Enduring Understandings, Knowledge and Skills) to determine the primary criteria for the rubric • Develop descriptive criteria for each level of performance (a 4-point scale is recommended)

  27. Rubric samples • http://ohiorc.org/standards_first/ • http://wvde.state.wv.us/teach21/SecondaryRubrics.html • http://www.greece.k12.ny.us/academics.cfm?subpage=1369 • http://wvde.state.wv.us/teach21/WVDERubrics.html • http://jfmueller.faculty.noctrl.edu/toolbox/index.htm • http://www.iac.pdx.edu/content/examples-rubrics • http://www.uwstout.edu/soe/profdev/rubrics.cfm • http://www.ncsu.edu/midlink/rub.mm.st.htm • http://course1.winona.edu/shatfield/air/rubrics.htm • http://www.utexas.edu/academic/ctl/assessment/iar/students/report/rubrics-research.php
