
UbD Think like an Assessor Stage 2

Presentation Transcript


  1. UbD Think like an Assessor Stage 2 Dr. Robert Mayes University of Wyoming Science and Mathematics Teaching Center rmayes2@uwyo.edu

  2. Assessor – 3 basic questions
  • What kind of evidence do we need to support the attainment of goals?
    • Tasks that reveal understanding, such as comparing and contrasting or summarizing key concepts
  • What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved?
    • Criteria, rubrics, and exemplars are needed
  • Does the proposed evidence enable us to infer a student’s knowledge, skill, or understanding?
    • Validity and reliability concerns

  3. Stage 2: Evidence – Think like an assessor, not an activity designer
  (Comparison table: Assessor vs. Activity Designer)

  5. Continuum of Assessment Methods
  • Vary in several characteristics
    • Scope: from simple to complex
    • Time Frame: short-term to long-term
    • Setting: decontextualized to authentic
    • Structure: highly structured to ill-structured
  • Move from snapshot to scrapbook
  • Self-assessment of sources of evidence (HO)
  (Diagram: methods along the continuum – Informal checks, Observation/Dialogue, Quiz/Test, Academic Prompt, Performance Task)

  6. Collecting a Range of Evidence
  • Activity: (HO) determine a range of assessment evidence you may use related to:
    • The enduring understanding
    • Topics important to know and do
    • Topics worth being familiar with
  • Which assessment methods best fit the 3 categories?

  7. Academic Prompt Assessments
  • Open-ended questions or problems that require students to prepare a specific academic response
  • Require students to think critically and prepare a response
  • Require a constructed response under exam conditions
  • Divergent – no single best answer
  • Subjective, judgment-based scoring using criteria or a rubric
  • May or may not be secure
  • Often ill-structured – require development of a strategy
  • Involve analysis, synthesis, and evaluation

  8. Performance Task Assessments
  • Complex challenges that mirror the issues and problems faced by adults
  • Real or simulated settings; authentic
  • Require students to address an audience in non-exam conditions
  • Divergent – no single best answer
  • Subjective, judgment-based scoring using criteria or a rubric
  • Greater opportunity to personalize the task
  • Not secure – students are given the criteria in advance

  9. Performance Task – 6 Facets
  • Activity: Use the 6 Facets of Understanding to generate a performance task related to your enduring understanding
    • Questioning for Understanding (HO)
    • Performance Verbs (HO)
    • Performance Task creation (HO)
    • Performance Task brainstorming (HO)

  10. Performance Task – GRASPS
  • Creating a performance task with context and roles
    • Goal
    • Role
    • Audience
    • Situation
    • Product, Performance, and Purpose
    • Standards and Criteria for Success

  11. Performance Task – GRASPS
  • Activity: Create a performance task using GRASPS
    • GRASPS Performance Task Scenario (HO)
    • Student roles and audiences (HO)
    • Possible Products and Performances (HO)

  12. Assessor Question 2: Determine achievement
  • What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved?
    • Criteria, rubrics, and exemplars are needed

  13. Designing Scoring Rubrics
  • Rubric: a criterion-based scoring guide for evaluating a product or performance along a continuum
  • Consists of:
    • Evaluative Criteria – qualities that must be met for work to measure up to a standard
    • Fixed Measurement Scale – often 4 or 5 levels
    • Indicators – descriptive terms for differentiating among degrees of understanding, proficiency, or quality

  14. Rubric Types
  • Holistic – provides an overall impression of the elements of quality and performance levels in a student’s work
  • Analytic – divides a student’s performance into two or more distinct dimensions (criteria) and judges each separately
  • Recommend use of analytic rubrics with, at a minimum:
    • Criteria for understanding (HO)
    • Criteria for performance
  • Using Facet-Related Criteria (Figure 8.3, p. 178)

  15. Rubric Types
  • Generic – general criteria in a given performance area
    • Can be developed before the specific task is defined
    • Example: General Problem Solving Rubric
    • Example: Generic Rubric for Understanding (HO)
  • Task-Specific – designed for use with a particular assessment activity
    • Task-dependent, so it cannot be used to evaluate related performance tasks

  16. Rubric Types
  • Longitudinal Rubric – traces the progression from naïve to sophisticated understanding
    • Increased understanding of complex functions and the interrelatedness of concepts
    • Greater awareness of how the discipline operates
    • Greater personal control over and flexibility with knowledge

  17. Effective Rubrics
  • Relate specific task requirements to more general performance goals
  • Discriminate among different degrees of understanding or proficiency according to significant features
  • Do not combine independent criteria in one column of the rubric
  • Use student anchors (Anchor design, p. 181) to:
    • Set standards based on student artifacts
    • Ensure consistency in judgment of student work
    • Equip students to do more accurate and productive self-assessment

  18. Effective Rubrics
  • All potential performances should fit somewhere in the rubric
  • Rely on descriptive language (what quality looks like), not comparative or value language, to make distinctions
  • Avoid making the lowest score point sound bad; it should describe novice or ineffective performance
  • Emphasize judging the performance’s impact rather than over-rewarding mere process or effort

  19. Assessor Question 3: Valid and Reliable
  • Does the proposed evidence enable us to infer a student’s knowledge, skill, or understanding?
  • Validity: did we measure what we meant to measure?
    • Does the evidence indicate understanding of the expressed outcomes?
    • Are the performances appropriate to the understanding sought?
    • Do not pay so much attention to correctness that the degree of understanding is lost.

  20. Validity
  • Two key validity questions for assessment tasks:
    • Could a student do well on this performance task but still not demonstrate the understanding you are after?
    • Could a student perform poorly on this task but still have significant understanding of the ideas and show that understanding in other ways?
  • Activity: determining validity (Figure 8.5)

  21. Validity
  • Two key validity questions for a rubric:
    • Could the proposed criteria be met but the performer still not demonstrate deep understanding?
    • Could the proposed criteria not be met but the performer nonetheless still show understanding?

  22. Reliability
  • Reliable assessments reveal a credible pattern, a clear trend
  • Need multiple sources of evidence (a scrapbook) rather than just a snapshot of student performance
  • Use parallel assessments of the same concept in multiple assessment formats

  23. Dr. Robert Mayes University of Wyoming Science and Mathematics Teaching Center rmayes2@uwyo.edu
