Taxonomic Levels And Rubrics

Presentation Transcript


  1. Taxonomic Levels And Rubrics MMann/SAS

  2. Desired Outcomes • An awareness of taxonomic levels and their purpose • An awareness of the relationship between HCPS III benchmarks and taxonomic levels • An opportunity to match benchmarks and tasks to the taxonomic levels • An awareness of various types of performance assessment rubrics MMann/SAS

  3. Why do I need to know the taxonomic levels? To align our instruction and assessment to the targets. MMann/SAS

  4. Alignment – congruence or match between curriculum, instruction, and assessment • Curriculum (What): based on the GLOs & HCPS III • Instruction (How): implementation of the curriculum • Assessment (How Well): multiple measures of proficiency of the curriculum • At the center: STUDENT ACHIEVEMENT MMann/SAS

  5. Research shows that aligning curriculum with standards and assessment has a strong relationship to student achievement. (Prince-Baugh, 1997; Mitchell, 1998; Wishnick, 1989) MMann/SAS

  6. Standards Implementation Process Model • Identify relevant content standards • Determine acceptable evidence and criteria • Determine learning experiences that will enable students to learn what they need to know and to do • Teach and collect evidence of student learning • Assess student work to inform instruction or use data to provide feedback • Evaluate student work, make judgments on learning results, and communicate findings • Reteach, or repeat the process with the next set of relevant standards • Student involvement and teacher collaboration continue throughout the process. MMann/SAS Adapted from WestEd’s Learning from Assessment

  7. Taxonomy • Definition = the science or technique of classification • All targets, curriculum, instruction, activities, and assessments involve some level of thinking. MMann/SAS

  8. Cognition Type • “Cognition type,” or “cognitive demand,” generally refers to a taxonomy and reflects a classification of thinking rather than a sequential hierarchy (such as understanding prior to application and analysis). • Cognitive demand is determined by analyzing the context of the lesson: What support is provided, and what are the students being asked to do? MMann/SAS

  9. Adapt or adopt a systematic method for assigning performance expectations (McREL). Taxonomy of objectives = a system for identifying distinct levels of difficulty. Examples: • Bloom’s • Guilford’s • Three-Story Intellect • Marzano’s MMann/SAS

  10. Marzano’s Taxonomic Levels • Level 1: Retrieval - recall, execution • Level 2: Comprehension - integrating, representation, symbolizing • Level 3: Analysis - matching, classifying, error analysis, generalizing, specifying • Level 4: Utilization - decision making, problem solving, experimental inquiry, investigation MMann/SAS

  11. Not used with performance standards, but part of the taxonomy • Level 5: Metacognitive System - goal setting, process monitoring, monitoring clarity, monitoring accuracy • Level 6: Self System - examining importance, examining efficacy, examining emotional response, examining motivation. Adapted from Marzano (2001), Designing a New Taxonomy of Educational Objectives. MMann/SAS

  12. The Three Systems and Knowledge • New Task → the Self-System decides whether to engage (or to continue current behavior) • Metacognitive System → sets goals and strategies • Cognitive System → processes relevant information • Knowledge → the content the three systems operate on From Marzano (2001). Designing a New Taxonomy of Educational Objectives

  13. Marzano’s Taxonomic Levels (Cognitive System) MMann/SAS

  15. Standards Implementation Process Model (emphasizing Congruence of Content, Context & Cognitive Demand) • 1. Identify Relevant Standards • 2. Determine Acceptable Evidence and Criteria • 3. Determine Learning Experiences that will Enable Students to Learn What They Need to Know & Do • 4. Teach and Collect Evidence of Student Learning • 5. Assess Student Work to Inform Instruction or Use Data to Provide Feedback • 6. Evaluate Student Work and Make Judgment on Learning Results and Communicate Findings MMann/SAS

  16. Let’s Identify Taxonomic Levels (Marzano) • Retrieval: Recognize, Recall, Execute • Comprehension: Integrate, Symbolize • Analysis: Match, Classify, Analyze, Generalize, Specify • Knowledge Utilization: Decision Making, Problem Solving, Inquire Experimentally, Investigate MMann/SAS

  17. Level 1 - Knowledge Retrieval • Level 3 - Analysis MMann/SAS

  18. Drill and Practice • Work with your table group • Read each card • Group by taxonomic level MMann/SAS
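
For facilitators who keep the task cards in digital form, the small Python sketch below mirrors this sorting activity. It is illustrative only: the card texts, the verb-to-level lookup, and the group_cards helper are hypothetical and not part of the original workshop materials; the level names and skills follow slide 10.

    from collections import defaultdict

    # Marzano's cognitive-system levels and associated skills (see slide 10).
    # Keying on a task's leading verb is an assumption made for illustration.
    VERB_TO_LEVEL = {
        "recognize": "Level 1: Retrieval", "recall": "Level 1: Retrieval", "execute": "Level 1: Retrieval",
        "integrate": "Level 2: Comprehension", "symbolize": "Level 2: Comprehension",
        "match": "Level 3: Analysis", "classify": "Level 3: Analysis",
        "generalize": "Level 3: Analysis", "specify": "Level 3: Analysis",
        "decide": "Level 4: Knowledge Utilization", "investigate": "Level 4: Knowledge Utilization",
    }

    def group_cards(cards):
        """Group task-card texts by taxonomic level, based on each card's leading verb."""
        groups = defaultdict(list)
        for card in cards:
            verb = card.split()[0].lower().strip(".,")
            groups[VERB_TO_LEVEL.get(verb, "Unclassified")].append(card)
        return dict(groups)

    # Hypothetical cards, loosely modeled on the literary-analysis tasks in these slides.
    cards = [
        "Recall the setting of the story.",
        "Classify each character as major or minor.",
        "Symbolize the plot in a story map.",
        "Investigate how the central conflict shapes the theme.",
    ]
    for level, grouped in group_cards(cards).items():
        print(level, grouped)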

  19. Level of thinking helps determine the appropriate assessment method MMann/SAS

  20. Analyze plot, setting, characterization, or conflict to interpret theme in a literary text. • Describe the setting of the story. • Explain how the author uses the characters to convey a message. • Compare the plot of this story to the plot of the previous story. • Choose a literary element (e.g., conflict), describe how the author treats this element in the story, and assess how this element relates to the theme. MMann/SAS

  21. Knowing a taxonomy also helps in scaffolding instruction: Identify → Explain → Compare → Create MMann/SAS

  22. Three Tiers of Skill and Assessment Work (thanks to Heidi Hayes Jacobs) • Drill & Practice • Rehearsal & Scrimmage • Authentic Performance MMann/SAS

  23. The level of thinking in the benchmark is the level of thinking required to meet proficiency. MMann/SAS

  24. Balanced Assessment Model MMann/SAS

  25. Performance Assessment is • an assessment (product or performance) based on observation and judgment about its quality. • the activities, problems, projects, and assignments students are asked to perform. • anything from a special task at the end of instruction, as in a culminating event, to naturally occurring events during regular instruction. MMann/SAS

  26. The Importance of Criteria “Teachers [frequently] ask the wrong question first … ‘What do we do?’ - putting the focus immediately on designing tasks - when they need to ask, ‘What do we want kids to know and be able to do? How well? What does quality look like?’ [We] need to ask these questions very clearly first.” Mike Hibbard, Education Update, 38(4), p. 5, ASCD, June 1996. MMann/SAS

  27. To Know Criteria Requires ... • Being exposed to the criteria from the beginning of instruction. • Having terms defined. (lots of details) • Samples of strong and weak performances. • Practice with feedback using the vocabulary of the criteria. • Focused revision of work. • Practice articulating the vocabulary for quality and applying it to many situations. • Instruction consciously focused on subparts of the criteria. Judy Arter, ATI MMann/SAS

  28. What is a Rubric? • A scoring guide designed to provide constructive feedback to students • Designed to show how important elements of a task would look in a progression from less well developed to exceptional along a continuum (Tomlinson, 2003). • A Latin word that means “red.” MMann/SAS

  29. A Rubric = Dimensions (essential qualities) + Continuum (Scale) + Descriptors of points on the scale + Work samples illustrating those points. MMann/SAS
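
The four-part definition above maps directly onto a small data structure. The Python sketch below is only an illustration of that idea; the class names, fields, and sample dimension are hypothetical and are not drawn from any HCPS III rubric.

    from dataclasses import dataclass, field

    @dataclass
    class Dimension:
        """One essential quality being judged, with a descriptor for each point on the scale."""
        name: str                      # the essential quality, e.g. "Plot development"
        descriptors: dict              # scale point -> description of work at that point
        work_samples: dict = field(default_factory=dict)  # scale point -> illustrative work sample

    @dataclass
    class Rubric:
        """Dimensions + continuum (scale) + descriptors + work samples, per the slide above."""
        title: str
        scale: list                    # the continuum, e.g. [1, 2, 3, 4, 5]
        dimensions: list

    # Hypothetical example for illustration only.
    plot = Dimension(
        name="Plot development",
        descriptors={
            3: "Plot is developed fully and organized well.",
            2: "Plot is developed somewhat; organization needs improvement.",
            1: "Plot is addressed without attention to detail or organization.",
        },
    )
    rubric = Rubric(title="Fiction writing (sketch)", scale=[1, 2, 3], dimensions=[plot])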

  30. Holistic Rubrics • Holistic rubrics have one performance expectation description at each numerical level on the rubric. • The product or performance is evaluated as a whole, and given a single score. • Used “to obtain the overall impression of the quality of a performance or product.” (Wiggins and McTighe, 1999) MMann/SAS

  31. Holistic Rubrics • Quicker to write and to use. • Summative, because they evaluate work at the end of the process. • Fail to communicate to students, especially low-performing students, what their shortcomings are. MMann/SAS

  32. Holistic Rubric Example: Fiction Writing Content Rubric • 5 - The plot, setting, and characters are developed fully and organized well. The who, what, where, when, and why are explained using interesting language and sufficient detail. • 4 - Most parts of the story mentioned in a score of 5 above are developed and organized well. A couple of aspects may need to be more fully or more interestingly developed. • 3 - Some aspects of the story are developed and organized well, but not as much detail or organization is expressed as in a score of 4. • 2 - A few parts of the story are developed somewhat. Organization and language usage need improvement. • 1 - Parts of the story are addressed without attention to detail or organization. MMann/SAS

  33. Analytical Rubrics • Use multiple descriptors for each criterion evaluated. • A type of “task analysis” in which teachers award points on a criterion-by-criterion basis. • Described as teaching rubrics because their design helps students improve their own performance. MMann/SAS

  34. Analytic Rubric Example: Fiction Writing Content Rubric MMann/SAS

  35. Holistic or Analytical Trait? • Holistic - Use for: a quick snapshot of overall status; when speed of scoring is important; simple products or performances. Disadvantages: two students can get the same score for different reasons; can’t identify strengths & weaknesses; not useful for students. • Analytical - Use for: planning instruction (shows relative strengths & weaknesses); teaching students the details of quality; detailed feedback; when precision is more important than speed. Disadvantages: scoring is slower; takes longer to learn. MMann/SAS
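
One way to see the trade-off above: a holistic rubric reports a single overall score, while an analytical trait rubric keeps a score per criterion. The Python sketch below illustrates only that difference; the criterion names and scores are invented, and collapsing criterion scores into a rounded mean is an assumption made for the example, not how a rater actually applies a holistic rubric.

    def holistic(criterion_scores):
        """Collapse the whole piece into one overall score (modeled here as a rounded mean)."""
        return round(sum(criterion_scores.values()) / len(criterion_scores))

    # Analytical (trait) scoring keeps one score per criterion,
    # so relative strengths and weaknesses stay visible.
    student_a = {"plot": 5, "setting": 1, "characterization": 3}
    student_b = {"plot": 3, "setting": 3, "characterization": 3}

    # Both students get the same holistic score for different reasons...
    print(holistic(student_a), holistic(student_b))   # 3 3
    # ...but the analytic breakdown distinguishes them.
    print(student_a, student_b)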

  36. Descriptive Terms for Differences in Degree • Degrees of Understanding • Degrees of Frequency • Degrees of Effectiveness • Degrees of Independence • Degrees of Accuracy • Degrees of Clarity MMann/SAS

  37. Descriptive Terms for Differences in Degrees MMann/SAS

  38. Options for Selecting Rubrics • Create your own - build from scratch • Adopt - use an existing rubric • Adapt - Modify or combine existing rubrics • Reword parts • Drop or change one or more scales • Omit irrelevant criteria • “Mix and match” rubrics • Change a holistic rubric into an analytic rubric • Modify for different grade levels MMann/SAS

  39. Guidelines for Rubrics Rubrics are effective when teachers utilize the following criteria: • Use specific numbers like “2” or “3 or more” rather than vague words like “some,” “many,” or “few.” • Use specific descriptors, rather than general descriptors like “good” or “excellent.” • Use the vocabulary of the standards and benchmarks. • State clear expectations for work so that all teachers, students, and parents know the criteria for quality and the requirements for earning a grade. Burke, 2006 MMann/SAS

  40. Resources • Anderson, L., Krathwohl, D. et al. (2001). A Taxonomy for Learning, Teaching and Assessing. New York: Longman. • Curriculum Associates: Assessing Levels of Comprehension. • Lewin, L. & Shoemaker, B. J. (1998). Great Performances. Virginia: ASCD. • Marzano, R. J. (2001). Designing a New Taxonomy of Educational Objectives. Thousand Oaks: Corwin Press. • Popham, W. J. (2002). Classroom Assessment. Boston: Allyn & Bacon. • Stiggins, R. J. et al. (2004). Classroom Assessment for Student Learning. Portland: ATI. • Wahlstrom, D. (2002). Designing & Using High-Quality Paper-and-Pencil Tests. Virginia: Successline. • www.k12.wa.us/CurriculumInstruct/pubdocs/WERA/WERA2005_Webversion.pp • http://www.stedwards.edu/cte/content/view/1536/49/ MMann/SAS
