

Presentation Transcript


  1. Using Progress Variables to Map Intellectual Development Cathleen A. Kennedy and Mark Wilson University of California at Berkeley cakennedy@berkeley.edu

  2. Outline • Purpose of the study • Method • BEAR Assessment System • Calibration with Common Item Equating • Setting Standards • Findings • Progress Variables • Items • Progress Guides • Calibration • Performance Standards • Mapping Intellectual Development • Next Steps

  3. Purpose of the Study Develop a framework for modeling intellectual development within a curricular unit: • Determine relevant constructs and instructionally useful performance levels • Construct formative assessments aligned with those levels • Establish performance expectations and associated cut-points • Demonstrate the use of graphical progress charts for formative feedback

  4. Purpose: Context of the Study • Established FAST curriculum; 10-week unit on Buoyancy • Established multiple-choice pre/post test • Established instructional activities • ≈ 300 students across 8 CA schools Tasks: • Gather more useful assessment data • Formalize data gathering in selected performance activities • Extend some pre/post test items • Assess at least two learning dimensions: • A curriculum-specific content knowledge domain • A universal inquiry skill

  5. Method • Develop Assessments using the BEAR* Assessment System • Determine Progress Variables • Design Items • Develop Progress Guides • Establish Measurement Model • Calibrate with Common Item Equating • Set Performance Level Standards • Data * Berkeley Evaluation & Assessment Research Center

  6. Method: BAS BEAR Assessment System Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Lawrence Erlbaum Associates.

  7. Method: Calibration Calibrate with Common Item Equating [Linking diagram: Pretest, Perf. Act. 1, Perf. Act. 2, Perf. Act. 3 and Post Test connected through common items]

  8. Method: Standard Setting Standard Setting Define performance levels as logit ranges, ordered from respondents with limited knowledge, to some knowledge, to more knowledge, to extensive knowledge.

  9. Method: Standard Setting Standard Setting Define performance levels as logit ranges: where should the boundaries ("?") fall between respondents with limited, some, more and extensive knowledge?

  10. Method: Data • 8 CA middle schools • 1 teacher per school (most new to the curriculum) • 14 classrooms • 220 students completed all assessments (used for calibration and student growth) • 75 additional students completed post tests (used for calibration)

  11. Findings • Developed Two Progress Variables • Buoyancy: WTSF (specific content) • Reasoning (generic inquiry) • Modified existing assessment activities to elicit evidence of the two variables • Developed progress guides for the two variables • Calibrated Progress Variables • Set Performance Levels • Mapped Student Growth

  12. Findings: Progress Variables Development of Progress Variables BEAR Assessment System Principle 1 • Classroom assessment system should be based on a developmental perspective of student learning. • Building Block: Progress Variables* • Defines low, high and intermediate levels • Visual metaphor for • how students develop and • how their responses change * Progress Variables are also referred to as Construct Maps in the BEAR Assessment System.

  13. Findings: Progress Variables Development of Progress Variables • How many variables? • Which are most appropriate for the purpose? • How many levels are needed on each variable?

  14. Findings: Progress Variables Candidate Progress Variables: Mass, Volume, Density, Graphing, Process Inquiry, Buoyancy (Why things sink & float), Building Explanations

  15. Findings: Progress Variables Final Two Progress Variables: Buoyancy: WTSF (Why things sink & float) and Reasoning [diagram shows the candidate variables Mass, Volume, Density, Graphing, Process Inquiry and Building Explanations consolidating into these two]

  16. Findings: Progress Variables Useful Levels for Formative Feedback: WTSF

  17. Findings: Progress Variables Levels of WTSF from the Curriculum Understands relative density Understands density of matter Understands mass & volume Understands mass or volume alone

  18. Findings: Progress Variables More WTSF Levels after Reading Student Work Understands relative density Understands density of matter Understands mass & volume Understands mass or volume alone Misconceptions Unresponsive

  19. Findings: Items Design Modification of Assessments BEAR Assessment System Principle 2 • What is taught and what is assessed must be clearly aligned. • Building Block: Items Design • A framework for designing tasks to elicit specific kinds of evidence about student knowledge, as described in one or more progress variables, seamlessly integrated into the instructional activities of a course.

  20. Findings: Items Design Modification of Assessments • What is the purpose of assessing? • When should formal assessment take place? • What types of assessment tasks are appropriate?

  21. Findings: Items Design Critical Junctures in Instruction: WTSF Purpose is to determine readiness for the next part of instruction. Assessment points: RL4, RL7, RL10 (Reflective Lessons 4, 7 and 10).

  22. Findings: Items Design Graphing Activity Below is a data table and graph for the Sinking-Straws activity. Explain in detail what the data and graph tell you about mass and depth of sinking. Use information and evidence from the graph as well as information from what you know. You may write and draw a picture to show your explanation.

  23. Findings: Items Design Predict-Observe-Explain Activity We have three bottles. We know the mass of each bottle. We already measured their displaced volumes in water. Record the mass and displaced volume of each bottle in Table 1. Based on what you know about mass and volume, in Table 2 PREDICT whether each bottle will float, sub-surface float, or sink when your teacher places the bottles into water. Explain why.

  24. Findings: Items Design Essay Activity Explain below why things sink and float. Write as much information as you need to explain your answer. Use evidence and examples to support your explanation.

  25. Findings: Items Design Multiple Choice with Justification Items • 7. Amy put a wooden block into an overflow can filled with water. The block floats. She finds that the displaced water is 30 ml. What is the mass of the wooden block? • 10g • 30g • 60g • 90g Please explain why you chose that answer: ___________________________________ ___________________________________ ___________________________________
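
The stem above rests on Archimedes' principle: a floating object displaces water whose mass equals the object's own mass, so the displaced volume in millilitres gives the mass in grams directly. A minimal check, assuming a water density of 1 g/ml:

```python
# Archimedes' principle for a FLOATING object: the mass of displaced
# water equals the mass of the object. Water density assumed 1 g/ml.
WATER_DENSITY_G_PER_ML = 1.0

def floating_object_mass(displaced_volume_ml):
    """Mass (g) of a floating object, from its displaced water volume (ml)."""
    return displaced_volume_ml * WATER_DENSITY_G_PER_ML

print(floating_object_mass(30))  # 30.0 -> the 30 g option
```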

  26. Findings: Progress Guides Development of Progress Guides BEAR Assessment System Principle 3 • Teachers are the primary managers and users of assessment data. • Building block: Progress Guides* • Categories of student responses must make sense to teachers in the context of relevant progress variables. * Progress Guides are also referred to as the Outcome Space in the BEAR Assessment System.

  27. Findings: Progress Guides Level “MV” on WTSF Progress Variable For each performance level, the guide defines the criteria for coding student responses. [Table: coding criteria for level “MV”]

  28. Findings: Progress Guides Levels “RD” – “M,V” on WTSF

  29. Findings: Progress Guides Levels “PM” – “NR” on WTSF Similar coding guide developed for Reasoning variable.

  30. Findings: Progress Guides Reflective Lesson 10B You have six blocks. Blocks 1, 2 and 3 are made of one material, while blocks 4, 5 and 6 are made of another material. The density of blocks 1, 2 and 3 is 0.91 g/cm3, while the density of blocks 4, 5, 6 is 1.2 g/cm3. For each block, predict whether it will sink, float or subsurface float in water. Give your reasons for your predictions. Student Response Two Codes WTSF – D Reas. – P
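
The prediction logic behind this reflective lesson, comparing each block's density with the density of the medium, can be sketched as follows. The tolerance used for a sub-surface float is an illustrative assumption, not from the slides:

```python
WATER_DENSITY = 1.0  # g/cm^3, density of the medium

def predict_buoyancy(block_density, medium_density=WATER_DENSITY, tol=0.01):
    """Predict float / sink / sub-surface float from relative density."""
    if abs(block_density - medium_density) < tol:
        return "sub-surface float"
    return "float" if block_density < medium_density else "sink"

# Blocks 1-3 (0.91 g/cm^3) and blocks 4-6 (1.2 g/cm^3) from the lesson:
for density in (0.91, 1.2):
    print(density, "->", predict_buoyancy(density))
# 0.91 -> float
# 1.2 -> sink
```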

  31. Findings: Calibration Calibration BEAR Assessment System Principle 4 • Classroom assessment requires reliability and validity evidence and evidence for fairness. • Building Block: Measurement model • Seeking interpretive quality • Multidimensional partial credit model in which • Order of item difficulties is the same for all respondents • Order of respondents is the same for all item subsets • One parameter model for interpretive needs
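
As a rough illustration of the measurement model named on this slide, here are the category probabilities of a unidimensional partial credit model; the study itself uses a multidimensional version, and the step difficulties below are invented for the example:

```python
import math

def pcm_category_probs(theta, step_difficulties):
    """Partial credit model: probability of each score category for a
    respondent at location theta (logits), given the item's step
    difficulties. Category k's numerator is exp(sum of (theta - delta_j))."""
    cumulative = [0.0]
    for delta in step_difficulties:
        cumulative.append(cumulative[-1] + (theta - delta))
    exps = [math.exp(c) for c in cumulative]
    total = sum(exps)
    return [e / total for e in exps]

# A 3-category item with illustrative steps at -0.5 and 0.8 logits:
probs = pcm_category_probs(theta=0.2, step_difficulties=[-0.5, 0.8])
print(round(sum(probs), 6))  # 1.0 -- probabilities over the categories
```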

  32. Findings: Calibration Goal of Calibration: Interpretive Quality (1) • Person locations are interpreted in the context of item content (criterion referenced). [Wright map excerpt: respondent location X between RL4_A.Mass (“Buoyancy depends on the mass of the object”) and RL4_A.ProductiveMisconceptions (“Buoyancy depends on the object being flat, hollow, filled with air, etc.”)]

  33. Findings: Calibration Goal of Calibration: Interpretive Quality (2) • Student change derives meaning from items. [Wright map excerpt, top to bottom: “Buoyancy depends on the density of the object”; RL4_A.Mass&Volume (“Buoyancy depends on the mass and volume of the object”); X; RL4_A.Mass (“Buoyancy depends on the mass of the object”); X; RL4_A.ProductiveMisconceptions (“Buoyancy depends on the object being flat, hollow, filled with air, etc.”) — the two X marks show the student at two occasions]

  34. Findings: Calibration Calibrated All Items onto Same Scale • Five forms of post test contained all items from all instruments. • Anchored Pretest, RL4, RL7 and RL10 item difficulties from post test calibration.
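
The anchoring step can be sketched as follows: item difficulties estimated from the post-test calibration are held fixed, and each pretest or RL respondent is then located on that same logit scale. This is a simplified dichotomous Rasch sketch with invented anchor values, not the study's multidimensional partial credit calibration:

```python
import math

def estimate_ability(responses, anchored_difficulties, iters=25):
    """Locate one respondent on the common logit scale, holding item
    difficulties fixed at their anchored (post-test) values.
    Dichotomous Rasch MLE via Newton-Raphson; assumes the response
    vector is neither all-correct nor all-incorrect."""
    theta = 0.0
    for _ in range(iters):
        probs = [1.0 / (1.0 + math.exp(-(theta - b)))
                 for b in anchored_difficulties]
        gradient = sum(x - p for x, p in zip(responses, probs))
        information = sum(p * (1.0 - p) for p in probs)
        theta += gradient / information  # Newton-Raphson step
    return theta

# Illustrative anchor difficulties (logits) from a post-test run:
anchors = [-1.2, -0.4, 0.3, 1.1]
print(round(estimate_ability([1, 1, 0, 0], anchors), 2))
```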

  35. Findings: Set Perf. Levels Set Performance Levels • Based on Thurstonian thresholds (the location where a respondent has a 50-50 chance of reaching the level) • Find the mean of the thresholds for each step • Cut-point is the midpoint between adjacent means [Threshold map excerpt: 13.MV, 6.MV above the respondent location X; 4.M, 16.M, 3.M below]
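
The cut-point rule on this slide reduces to two averages and a midpoint. A minimal sketch, with threshold values invented for illustration rather than taken from the study:

```python
def cut_point(lower_step_thresholds, upper_step_thresholds):
    """Cut-point between two performance levels: the midpoint between the
    mean Thurstonian (50-50) thresholds of the two adjacent score steps."""
    mean_lower = sum(lower_step_thresholds) / len(lower_step_thresholds)
    mean_upper = sum(upper_step_thresholds) / len(upper_step_thresholds)
    return (mean_lower + mean_upper) / 2.0

# Illustrative logit thresholds for the ".M" steps (e.g. 4.M, 16.M, 3.M)
# and the ".MV" steps (e.g. 13.MV, 6.MV); the numbers are invented:
m_steps = [-1.9, -1.5, -1.4]   # mean -1.6
mv_steps = [-0.4, 0.0]         # mean -0.2
print(round(cut_point(m_steps, mv_steps), 2))  # -0.9
```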

  36. Findings: Set Perf. Levels WTSF Performance Levels

  37. Findings: Set Perf. Levels WTSF Performance Levels

  38. Findings: Set Perf. Levels Set Criterion-Referenced Cut Points (logits) • Relative Density (cut-point 1.4): Buoyancy depends on the density of the object relative to the density of the medium. • Density (cut-point 0.4): Buoyancy depends on the density of the object. • Mass & Volume (cut-point -0.6): Buoyancy depends on the mass and volume of the object. • Mass or Volume (cut-point -1.6): Buoyancy depends on the mass or volume of the object. • Misconception (cut-point -2.3): Buoyancy depends on the object being flat, hollow, filled with air, etc. • Unresponsive: Does not attend to any property or feature to explain floating.
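
With the logit cut-points reported on this slide, classifying a respondent's estimated location into a WTSF performance level becomes a simple lookup. A sketch, not the authors' software:

```python
# WTSF cut-points (logits) as reported on the slide, highest level first.
WTSF_CUTS = [
    (1.4, "Relative Density"),
    (0.4, "Density"),
    (-0.6, "Mass & Volume"),
    (-1.6, "Mass or Volume"),
    (-2.3, "Misconception"),
]

def wtsf_level(theta):
    """Return the WTSF performance level for a logit estimate theta."""
    for cut, level in WTSF_CUTS:
        if theta >= cut:
            return level
    return "Unresponsive"

print(wtsf_level(0.9))   # Density
print(wtsf_level(-2.0))  # Misconception
```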

  39. Findings: Map Performance Map Student Performance • Class performance status • Individual student change • Comparisons to curricular expectations • Item diagnostics

  40. Findings: Map Performance Class Performance Status After Lesson 7 [Map locating each student, including Amy, Tom and Brian, on the WTSF scale]

  41. Findings: Map Performance Performance for Amy through Lesson 7 [Chart of Amy’s estimates at Pretest, Lesson 4 and Lesson 7]
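
A growth chart like this one can be reduced to a small table: classify the student's logit estimate at each occasion against the WTSF cut-points from slide 38. The estimates below are hypothetical, chosen only to show the kind of progression the chart displays:

```python
# WTSF cut-points (logits) from slide 38, highest level first.
CUTS = [(1.4, "RD"), (0.4, "D"), (-0.6, "MV"), (-1.6, "MorV"), (-2.3, "Mis")]

def level(theta):
    """Short WTSF level code for a logit estimate."""
    for cut, name in CUTS:
        if theta >= cut:
            return name
    return "UR"

# Hypothetical estimates for one student across the three occasions:
trajectory = [("Pretest", -1.9), ("Lesson 4", -0.9), ("Lesson 7", 0.1)]
for occasion, theta in trajectory:
    print(f"{occasion:>8}: {theta:+.1f} logits -> {level(theta)}")
# shows growth from Mis to MorV to MV
```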

  42. Findings: Map Performance Compared to Curricular Expectations

  43. Findings: Map Performance Item Diagnostics Diagnostic Map (Student: Brian, Variable: WTSF, Item Set: RL7) Response Level / Unachieved Levels XXX

  44. Findings: Map Performance Item Diagnostics Diagnostic Map (Student: Brian, Variable: WTSF, Item Set: RL7) Response Level / Unachieved Levels RL7b.MV RL7b.MorV XXX RL7b.Mis Performed as expected on this P-O-E item.

  45. Findings: Map Performance Item Diagnostics Diagnostic Map (Student: Brian, Variable: WTSF, Item Set: RL7) Response Level / Unachieved Levels RL7d.MV XXX RL7d.MorV RL7d.Mis RL7d.UR Performed more poorly than expected on this P-O challenge item.

  46. Next Steps Next Steps • Procedure tested in FOSS curriculum Fall 2006 • Gathering teacher reactions to • Using standards-based graphs for formative feedback • Using software to record item codes • Consistency with intuition about students

  47. Questions & Answers Thank You! Cathleen Kennedy cakennedy@berkeley.edu
