Revised Bloom’s Taxonomy and Designing Effective Rubrics

Presentation Transcript

  1. Revised Bloom’s Taxonomy and Designing Effective Rubrics Jeffrey A. Greene, Ph.D. Assistant Professor of Educational Psychology, Measurement, and Evaluation School of Education University of North Carolina at Chapel Hill May 27, 2010

  2. Our Learning Objectives • Understand Revised Bloom’s Taxonomy • Relation to learning outcomes and assessment • Be able to create a Taxonomy Table • Understand purposes of formative and summative assessment • Be able to create effective rubrics

  3. Bloom’s Taxonomy

  4. Purpose of Revising Bloom’s Taxonomy • Align educational objectives, instruction, and assessment • Enable teachers to prepare students for state-wide standardized assessments without “teaching to the test” • Expand education beyond memorization to “higher-order” cognition • Highlight that different types of objectives require different types of assessment

  5. Revised Bloom’s Taxonomy • Verbs (cognitive processes) • Nouns (kinds of knowledge)

  6. Kinds of Knowledge • Factual/Declarative: “what” knowledge • Conceptual: connections, relations among “what” knowledge • Also classifications, generalizations, theories, models • Procedural: “how” knowledge • Domain-general, subject-specific • Metacognitive: “when” or “where” or “for what reason” knowledge • Knowledge of strategies • Knowledge of tasks • Monitoring understanding • Self-regulating learning

  7. Cognitive Processes • Remember • Recognizing, recalling • Understand • Interpreting, exemplifying, classifying, summarizing, inferring, comparing, explaining • Apply • Executing, implementing • “Near transfer”

  8. Cognitive Processes • Analyze • Breaking down into parts, differentiating, organizing, attributing • Requires conceptual knowledge • “What happens if the system breaks down?” • Evaluate • Making judgments, checking, critiquing • Create • Generating, planning, producing novel products

  9. Taxonomy Table

  10. Taxonomy Table Standard 301.b.1 “Knowledge and understanding of substantive law…necessary for effective and responsible participation…” Question: what does “understand” really mean in this standard? Apply, analyze, evaluate?
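A taxonomy table is essentially a grid with kinds of knowledge as rows and cognitive processes as columns, and learning objectives filed into cells. As a minimal illustrative sketch (not from the presentation — the two sample placements of Standard 301.b.1 are hypothetical readings, mirroring the slide's question about what "understand" really means):

```python
# A taxonomy table as a grid of knowledge types (rows, the "nouns")
# by cognitive processes (columns, the "verbs").
KNOWLEDGE = ["Factual", "Conceptual", "Procedural", "Metacognitive"]
PROCESSES = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Empty table: every (knowledge, process) cell starts with no objectives.
table = {k: {p: [] for p in PROCESSES} for k in KNOWLEDGE}

def place_objective(table, knowledge, process, objective):
    """File a learning objective in its (noun, verb) cell."""
    table[knowledge][process].append(objective)

# Two hypothetical readings of Standard 301.b.1: "understanding of
# substantive law" might mean Conceptual/Understand -- or, on a stronger
# reading, Procedural/Apply.
place_objective(table, "Conceptual", "Understand",
                "301.b.1: understand substantive law")
place_objective(table, "Procedural", "Apply",
                "301.b.1 (stronger reading): apply substantive law")

# Cells with no objectives reveal gaps between objectives, instruction,
# and assessment.
empty = [(k, p) for k in KNOWLEDGE for p in PROCESSES if not table[k][p]]
print(len(empty))  # 22 of the 24 cells are still empty
```

Scanning for empty cells is what makes the table useful for curriculum review: every cell with an objective should also have instruction and an assessment attached to it.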

  11. Let’s Create a Taxonomy Table • We need a learning objective

  12. Curriculum-Wide Taxonomy Table Standard 302.b.i “Proficiency as an entry level practitioner in legal research”

  13. Challenges to Revised Taxonomy • What happens when learning objectives, instruction, and assessment do not line up? • Objectives and instruction without assessment • Instruction and assessment without objectives • Objectives and assessment without instruction • Does this all take too much time?

  14. Connection to Objectives and Instruction • Ideally objectives use verbs from taxonomy • 19 cognitive processes • Can put instructional methods in boxes as well • What content/activities/skills/knowledge/beliefs are necessary to promote this noun/verb combo? • What combinations of nouns/verbs need to be taught in-class, and what can be done through technology? • Declarative knowledge / remember, understand • Classtime = practice, conceptual knowledge

  15. Connection to Assessment • Standard 304.a.2 “Employ a variety of assessment methods and activities, consistent with effective pedagogy, systematically and sequentially throughout the curriculum…” • Every learning objective in the taxonomy table should have an assessment (or part of an assessment) assigned to it • Emphasis on variety of cognitive processes and formative feedback requires expanded list of assessment types • Classification task “Is X an example of class Y?” • Essays • Experiential / performance-based assessments • Professor Penny Wilrich’s presentation this morning

  16. Additional Assessment Challenges • How do you assess procedural, conceptual, and metacognitive knowledge? • How do you assess 302.b.1.i “Legal analysis and reasoning, legal research, problem solving…etc.”? • How do you assess emotional intelligence, collaboration skills, and the ability to work in a hierarchy? • Requires: • Assessment conceptualized as both formative and summative • Ways of scoring constructed response items • Rubrics

  17. Formative and Summative Assessment • Standard 304.a.3: “Provide feedback to students periodically and throughout their studies about their progress in achieving its learning outcomes” • This morning: “People don’t like to be assessed” • Assessment culture • Formative: diagnostic, measure of developing knowledge • Informative feedback • Repeated assessment: same skills, different contexts • Often ungraded or self-graded • Summative: evaluative, indication of achievement

  18. Rubrics • Rubrics clarify the important assessment criteria for ill-structured tasks • Performances, debates, presentations • Improve accuracy and reliability of scoring • Rubrics should be specific to the assignment • Provide space for narrative comments

  19. 4 Key Dimensions of Rubrics • Evaluative criteria for scoring: should follow from learning objectives • Quality definitions: determine how to discriminate between good, adequate, and poor performance on criterion • Scoring strategy: • Holistic: judge entire product using combined criteria • Analytic: score criteria individually, then aggregate for final score • Scorer(s): Professor, peers, self, others?
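The analytic strategy above can be made concrete: score each evaluative criterion separately, then aggregate into a final score. A minimal sketch, assuming a hypothetical three-criterion essay rubric with made-up weights (the criteria, weights, and 0–4 scale are illustrative, not from the talk):

```python
def analytic_score(criterion_scores, weights):
    """Weighted aggregate of per-criterion scores (each on a 0-4 scale)."""
    assert criterion_scores.keys() == weights.keys()
    total_weight = sum(weights.values())
    return sum(criterion_scores[c] * weights[c] for c in weights) / total_weight

# Hypothetical essay rubric: three criteria drawn from learning objectives,
# with "analysis" weighted most heavily.
weights = {"thesis": 2, "analysis": 3, "organization": 1}
scores = {"thesis": 4, "analysis": 3, "organization": 2}

print(analytic_score(scores, weights))  # (8 + 9 + 2) / 6, roughly 3.17
```

A holistic strategy would instead assign one overall judgment to the whole product; the analytic version trades speed for a per-criterion breakdown that doubles as formative feedback.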

  20. Example Essay Rubric – Analytic, Professor Scored

  21. Rubric Problems • Too general: “A superior essay is one in which the essay itself is written in a high-quality, excellent manner. An inferior essay is typified by poor quality. An adequate essay falls between the superior and inferior essays in terms of organization.” • Too specific (and detailed): “A superior essay has (1) a specific statement about the two major components of rubrics, (2) a discussion of the two types of scoring strategies…”

  22. Exercise: Developing Rubrics • Evaluative Criteria • Quality Definitions

  23. Developing and Using Rubrics • By professor: iterate and improve • In collaboration with students • Time consuming • Can build trust, understanding • When to distribute rubrics: • With assignment: can help with formative assessments • With grades: justifies summative assessments

  24. Advanced Assessment Challenges • 302.b.2.ii “Ability to recognize and resolve ethical and other professional dilemmas” • Recognize: “Remember” cognitive process • Resolve: “Apply” cognitive process • Practice assessments? Role plays?

  25. Questions? • How can taxonomy tables and rubrics be used: • Within classes • Across classes • Across curricula

  26. References
Airasian, P. W., & Miranda, H. (2002). The role of assessment in the revised taxonomy. Theory Into Practice, 41(4), 249-254.
Anderson, R. S., & Puckett, J. B. (2003). Assessing students’ problem-solving assignments. New Directions for Teaching and Learning, 95, 81-87.
Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41(4), 212-218.
Lalley, J. P., & Gentile, J. R. (2008). Classroom assessment and grading to assure mastery. Theory Into Practice, 48(1), 28-35.
Rhodes, T. L. (Ed.) (2010). Assessing outcomes and improving achievement: Tips and tools for using rubrics. Association of American Colleges and Universities.
Tractenberg, R. E., Umans, J., & McCarter, R. (2010). A mastery rubric: Guiding curriculum design, admissions and development of course objectives. Assessment & Evaluation in Higher Education, 35(1), 15-32.

  27. More General Resources • Assessment & Evaluation in Higher Education journal: • Association of American Colleges and Universities resources on assessment: • ETS “Culture of Evidence” reports: