
Construct Driven Assessment Design: NCLT Reality or Lake Michigan Pipe Dream?



  1. Construct Driven Assessment Design: NCLT Reality or Lake Michigan Pipe Dream? Jim Pellegrino, Joe Krajcik, Shawn Stevens & Namsoo Shin

  2. Overview of the Next Two Days • Why we're here & what we'll do • Construct-driven assessment - an introduction • Activity 1: Unpacking the "Construct" • Activity 2: Translation into Claims & Evidence • Reporting out of progress • Activity 3: Creating an Assessment Blueprint • Reporting out of progress • Wrap-up & Next Steps • Tasks and tools • Final thoughts

  3. The "Need" [diagram labels: Professional Development, Concept Inventory, Self-Assembly, Learning Progression, Nanoconcepts] Developing good assessments for evaluating student understanding is critical for all aspects of work in the NCLT. Therefore, we need a consistent, principled way to develop assessments.

  4. Examples of assessment as a driver of NCLT work
  • Concepts & Concept Inventories: using assessments as diagnostic aids for instructors
  • Learning Progressions: assessing student knowledge to understand how students' knowledge builds over time; developing assessments from learning progressions
  • Curriculum Development: how do you know that your curriculum materials, technology tools, etc. were successful? how do you provide feedback for learners?
  • Professional Development: how do we assess teacher learning? how do we provide feedback for learners?

  5. Workshop Activities & Objectives • For a few of the “big ideas” in nanoscience and nanotechnology: • Unpack each one to define the “construct” to be assessed and the relevant “claim space” about student knowledge and understanding • Decide what would serve as appropriate “evidence” that a student has the desired knowledge. • Design an “assessment blueprint” that includes the range of tasks, questions and/or situations needed to provide the types of evidence you want and need for a particular assessment purpose.

  6. Layers in the assessment enterprise
  • Unpacking the domain: What is important about this domain? What work and situations are central in this domain? What KSAs (knowledge, skills, and abilities) are central to this domain?
  • Conceptual Assessment Framework: How do we represent key aspects of the domain in terms of an assessment argument? Design structures -- domain model (claims), evidence, and task models.
  • Assessment Implementation: How do we choose and present tasks, and gather and analyze responses?
  • Assessment Delivery: How do students and tasks actually interact? How do we report examinee performance?

  7. Construct-Driven Assessment What complex of knowledge, skills, or other attributes should be assessed? What behaviors or performances should reveal those constructs? What tasks or situations should elicit those behaviors? (Messick, 1994)

  8. Assessment as a Process of Reasoning from Evidence
  • Cognition: model of how students represent knowledge & develop competence in the domain
  • Observations: tasks or situations that allow one to observe students' performance
  • Interpretation: method for making sense of the data
  [Triangle diagram: the cognition, observation, and interpretation vertices must be coordinated!]

  9. Scientific Foundations of Educational Assessment • Advances in the Sciences of Thinking and Learning -- the cognition vertex • informs us about what observations are important and sensible to make • Contributions of Measurement and Statistical Modeling -- the interpretation vertex • informs us about how to make sense of the observations we have made

  10. An Example: Wilson's Approach [diagram: Construct Map (Cognition), Items Design (Observation), Item Scores (Interpretation), Measurement Model (Interpretation)]

  11. Building Block 1: Construct Map • Developmental perspective • the assessment system should be based on a developmental perspective of student learning • Progress variable as a representation • a visual metaphor for how students develop knowledge & understanding and for how we think their item responses might change

  12. Example: Why things sink and float

  13. Building Block 2: Items Design • Instruction & assessment match • there must be a match between what is taught and what is assessed • Items design • a set of principles that allows one to observe students under a set of standard conditions that span the intended range of item contexts • Example: Please answer the following question. Write as much information as you need to explain your answer. Use evidence, examples, and what you have learned to support your explanations. Why do things sink and float?

  14. Building Block 3: Outcome Space • Management by teachers • teachers must be the managers of the system, and hence must have the tools to use it efficiently and to use the assessment data effectively and appropriately • Outcome space • categories of student responses must make sense to teachers

  15. Example: Why things sink and float
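The sink-and-float example on this slide was an image in the original deck. As a stand-in, here is a minimal Python sketch of what such an outcome space might look like in code: an ordered set of response categories that teachers can apply to student explanations. The category codes match those on the group-progress slide below; the descriptions attached to them are assumptions for illustration, not taken from the workshop materials.

```python
# Hypothetical outcome space for "Why do things sink and float?".
# The category codes (OT..RD) appear on the group-progress slide;
# the descriptions are illustrative assumptions, ordered from least
# to most sophisticated understanding.
OUTCOME_SPACE = [
    ("OT", "Off target: response does not address the question"),
    ("UF", "Unconventional feature: cites an irrelevant property"),
    ("PM", "Productive misconception: partially productive reasoning"),
    ("M",  "Explains sinking/floating using mass alone"),
    ("V",  "Explains sinking/floating using volume alone"),
    ("MV", "Coordinates mass and volume"),
    ("D",  "Explains using density"),
    ("RD", "Explains using relative density of object and medium"),
]

def score(category: str) -> int:
    """Map a response category to its ordinal position on the construct map."""
    codes = [code for code, _ in OUTCOME_SPACE]
    return codes.index(category)

print(score("MV"))  # 5: above single-feature answers, below density
```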

  16. Building Block 4: Measurement Model • Evidence of quality • reliability and validity evidence, evidence for fairness • Measurement model • multidimensional item response models, to provide links over time, both longitudinally within cohorts and across cohorts
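To make the measurement-model building block concrete, the sketch below implements the simplest item response model, the Rasch model, which places students and items on the same scale. The slide references multidimensional item response models; this unidimensional version is illustrative only, not the model used in the workshop.

```python
import math

def rasch_probability(theta: float, difficulty: float) -> float:
    """Rasch model: probability that a student at ability theta answers
    an item of the given difficulty correctly,
    P = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# A student slightly above an item's difficulty succeeds ~62% of the time.
print(round(rasch_probability(theta=0.5, difficulty=0.0), 2))  # 0.62
```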

  17. Example: Evaluate progress of a group [chart showing the distribution of a group's responses across the ordered construct-map levels OT, UF, PM, M, V, MV, D, RD; a sketch of such a tabulation follows]
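A hedged sketch of the kind of group-level tabulation this slide's chart shows: counting what proportion of a group's responses fall at each construct-map level. The class data here are invented for illustration.

```python
from collections import Counter

# Ordered construct-map levels from the slide (least to most sophisticated).
LEVELS = ["OT", "UF", "PM", "M", "V", "MV", "D", "RD"]

def level_distribution(responses):
    """Return the proportion of a group's responses at each level."""
    counts = Counter(responses)
    total = len(responses)
    return {level: counts[level] / total for level in LEVELS}

# Illustrative class data, not real workshop results.
print(level_distribution(["PM", "M", "MV", "MV", "D", "D", "RD", "V"]))
```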

  18. Example: Evaluate a student's locations over time (embedded assessments)

  19. Why Models of the Development of Domain Knowledge are Critical • Tell us which aspects of knowledge are important to assess • Give deeper meaning and specificity to standards • Give us strong clues as to how such knowledge can be assessed • Suggest what can and should be assessed at points proximal or distal to instruction • Can lead to assessments that yield more instructionally useful information -- diagnostic & prescriptive • Can guide the development of systems of assessments -- work across levels & time

  20. AP Redesign Process: Design of a Multipart Framework for the Domain Analysis [diagram: Course Goal; Integrated Knowledge; Big Ideas of XXXXX; Unifying Concepts (Systems, Models, Scale, Structure/Function, Continuity and Change); Scientific Inquiry and Reasoning (Scientific Investigation, Technology Design, Problem Solving, Critical Thinking); Science Explains the Real World; Application to the Real World]

  21. Chemistry Domain Analysis [framework diagram labels: Course Goal; Integrated Knowledge; Big Ideas of Chemistry; Unifying Concepts (Systems, Models, Scale, Structure/Function, Continuity and Change); Scientific Inquiry and Reasoning; Science Explains the Real World]
  Goals: Traditional emphasis on calculation and description shifts to an emphasis on the underlying concepts and the reasoning from which they emerge. Traditional emphasis on teacher-directed procedures is an early scaffold that supports subsequent experiences involving guided inquiry.
  Scientific inquiry and reasoning -- applying the scientific way of knowing through reasoning based on evidence: asking testable questions, drawing conclusions based on evidence, and generating useful representations; using symbolic and graphical representations of relationships; organizing and communicating ideas; experimental design, execution and data analysis.
  Big Ideas of Chemistry:
  • Matter is made from discrete, fundamental units called atoms.
  • Chemical and physical properties of materials can be explained by the structure and the arrangement of atoms, ions or molecules and the forces between them.
  • Changes in matter involve the rearrangement and/or reorganization of atoms and/or the transfer of electrons.
  • Rates of chemical reactions are determined by details of the molecular collisions.
  • The laws of thermodynamics explain and predict the direction of changes in matter.
  • Any bond or intermolecular attraction that can be formed can be broken; these two processes are in a dynamic competition, sensitive to initial conditions and external perturbations.

  22. Chemistry Levels 1 & 2

  23. Chemistry Levels 1 & 2 (cont)

  24. Physics Domain Analysis [framework diagram labels: Course Goal; Integrated Knowledge; Big Ideas of Physics; Unifying Concepts (Systems, Models, Scale, Structure/Function, Continuity and Change); Scientific Inquiry and Reasoning; Science Explains the Real World]
  Beneath the seven big ideas of content lie 40 enduring understandings whose scope has been defined with 143 level-3 concepts describing what is and is not in the course.
  Scientific inquiry and reasoning: strategic skills that support the identification and solution of problems; measurement and interpretation of measurement; constructing and interpreting visual and graphical representations of relationships; applying mathematical reasoning; drawing conclusions based on evidence; experimental design.
  Big Ideas of Physics:
  • Objects and systems have properties such as mass, charge, and internal structure.
  • Fields existing in space can be used to explain interactions.
  • The interactions of one object or system with another are described by dynamics.
  • The interactions of one object or system with another can be described by conservation laws.
  • The interactions of one system with another system can be mediated by waves.
  • Interactions among systems can result in changes to those systems.
  • The evolution of a complex system is determined by the probability of its configuration.

  25. Physics Levels 1 & 2

  26. Physics Levels 1 & 2 (cont)

  27. The structure of a single Big Idea with 8 learning objectives

  28. Each learning objective is given meaning with descriptions at one more level of detail

  29. AP Domain Reporter Questions

  30. Example: What does it mean to understand the particle model of matter for solids?
  • that all matter is made of particles
  • that in solids, the particles (atoms) are arranged in an ordered, compact way
  • that the particles (atoms) are in constant motion; the degree of motion is dependent on temperature
  • that the particles (atoms) are the fundamental building blocks of matter
  • that all particles (atoms) of the same type are the same shape and size
  • that the arrangement of particles (atoms) determines the substance and can affect its properties

  31. Your First Major Assignment • For your "big idea," begin to define the construct • What does it mean to "know & understand" this? • What's appropriate for students at Level X to understand (as contrasted with domain experts)? • What is the possible prerequisite or co-requisite knowledge? • What are some possible student misconceptions and/or difficulties? • Use the "big ideas" materials already provided • Try to think in terms of Learning Progressions • Do it verbally, graphically, or however it seems to make sense for your group • Be prepared to upload your work to the wiki

  32. Evidence-Centered Design • Exactly what knowledge do you want students to have, and how do you want them to know it? • What task(s) will the students perform to communicate their knowledge? • What will you accept as evidence that a student has the desired knowledge? • How will you analyze and interpret the evidence? [diagram: claim space (OR construct), evidence, task; a sketch follows] Frase, L. T., Chodorow, M., Almond, R. G., Burstein, J., Kukich, K., Mislevy, R. J., Steinberg, L. S., & Singley, K. (in press). Technology and assessment. In H. F. O'Neil & R. Perez (Eds.), Technology applications in assessment: A learning view.
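One way to see how the claim-evidence-task chain hangs together is as a data structure. The sketch below is a hypothetical rendering of the slide's diagram in Python, using the particle-model claim from the surrounding slides; the field names and example tasks are illustrative, not part of the ECD literature.

```python
from dataclasses import dataclass

@dataclass
class AssessmentDesign:
    claim: str             # what we want to be able to say about the student
    evidence: list[str]    # behaviors/performances accepted as evidence
    tasks: list[str]       # tasks or situations that elicit those behaviors

# Hypothetical example built from the particle-model claims on slide 33.
particle_model = AssessmentDesign(
    claim="The student understands that all matter is made of particles.",
    evidence=[
        "Explains a macroscopic phenomenon in terms of particles",
        "Draws and labels a particle-level model of a solid",
    ],
    tasks=[
        "Ask the student to explain, with a drawing, why a solid keeps "
        "its shape when a liquid does not.",
    ],
)
print(particle_model.claim)
```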

  33. Possible claims about what a student understands about the particle model
  The student understands...
  • that all matter is made of particles
  • that in solids, the particles (atoms) are arranged in an ordered, compact way
  • that the particles (atoms) are in constant motion; the degree of motion is dependent on temperature
  • that the particles (atoms) are the fundamental building blocks of matter
  • that all particles (atoms) of the same type are the same shape and size
  • that the arrangement of particles (atoms) determines the substance and can affect its properties

  34. Evidence • What would you accept as appropriate evidence that a student has the desired knowledge? • What does it mean to know that? • What kinds of behaviors or performances are necessary for a student to demonstrate that he or she has that knowledge? • To do this you need to use words like: state, explain, evaluate, analyze, apply, model

  35. AP Translation Process

  36. Elaboration of a Claim

  37. AP Redesign Model of Knowing & Learning A claim has the general form: The student is able to <Verb & verb clauses> the enduring understanding within <contexts derived from the Commission's scope-delimiting report>. Evidence supporting a claim is expressed as a work product whose features are defined by these reports. (A code sketch of this template follows.)
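A minimal sketch of this claim frame as code: filling a verb phrase and a context into the fixed template. The verb and context values below are invented examples, not drawn from the Commission's reports.

```python
# The claim frame from the slide; <Verb & verb clauses> and
# <contexts ...> become the two slots.
CLAIM_FRAME = ("The student is able to {verb} the enduring understanding "
               "within {context}.")

def make_claim(verb: str, context: str) -> str:
    """Instantiate the claim frame with a verb phrase and a context."""
    return CLAIM_FRAME.format(verb=verb, context=context)

# Hypothetical instantiation.
print(make_claim("explain and predict",
                 "changes of state at the particle level"))
```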

  38. Elaboration of Evidence

  39. AP Redesign Model of Knowing & Learning Each concept description generates multiple claims. Student work can provide evidence to support the claim. The features of this evidence are taken from the concept descriptions.

  40. Your Next Major Assignment • For the "big idea" that you have begun to unpack and elaborate, now try to specify: • What possible claims about student knowledge, skill, and understanding you would want and/or need to make • For each claim, state it as precisely as you can • What are major claims and minor claims? • For whom is such a claim appropriate? • State as precisely as you can the nature of the evidence that is needed to support a claim • Be sure to consider the mapping between claims and evidence • Do this as systematically as possible • Be prepared to upload your work to the wiki

  41. Layers in the assessment enterprise (from Mislevy & Riconscente, in press)
  • Domain Analysis: What is important about this domain? What work and situations are central in this domain? What KRs (knowledge representations) are central to this domain?
  • Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument?
  • Conceptual Assessment Framework: design structures -- student, evidence, and task models.
  • Assessment Implementation: How do we choose and present tasks, and gather and analyze responses?
  • Assessment Delivery: How do students and tasks actually interact? How do we report examinee performance?

  42. The same layers diagram, annotated: cognitive psychology, expertise studies, domain research.

  43. The same layers diagram, annotated: assessment argument.

  44. The same layers diagram, annotated: explicit connection to domain & purpose; generative structures for recurring kinds of proficiencies (e.g., inquiry cycles, troubleshooting) across projects; PADI design patterns.

  45. The same layers diagram, annotated: generative design schemas.
