
MAC: Logging Students’ Model-Based Learning and Inquiry in Science




Presentation Transcript


  1. MAC: Logging Students’ Model-Based Learning and Inquiry in Science

Principal & Co-Principal Investigators
• Paul Horwitz, Concord Consortium, Principal Investigator
• Janice Gobert, Concord Consortium, Co-PI & Research Director
• Bob Tinker, Concord Consortium, Co-PI
• Uri Wilensky, Northwestern University, Co-PI

Other senior personnel
• Barbara Buckley, Concord Consortium
• Chris Dede, Harvard University
• Amie Mansfield, Concord Consortium
• Sharona Levy, Northwestern University
• Ken Bell, Concord Consortium
• Trudi Lord, Concord Consortium
• Intern: Nathaniel Putnam

mac.concord.org; IERI #0115699
www.concord.org
http://ccl.northwestern.edu

  2. Description of the project
• Developed and distributed model-based tools for scientific inquiry in genetics, Newtonian mechanics, kinetic molecular theory, and atomic structure.
• The project involves 3 partner schools, 11 member schools, 298 contributing schools, and 2,043 demo schools.
• Our activities are scaffolded to support model-based reasoning and inquiry; we pose problems and monitor students as they solve them.
• All students’ interactions with the models are logged.
• We analyze individual student logs by hand, then automate the process to examine the entire population.
• We use the log files to infer students’ content learning and inquiry skill development and to characterize classroom implementations.
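The kind of per-interaction record that logging of this sort produces can be sketched as below. This is a minimal illustration of the idea, not the project's actual log schema: the field names, the `LogEvent` class, and the `events_for_student` helper are all our assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class LogEvent:
    """One logged interaction with a model.  Field names are illustrative,
    not the project's actual log format."""
    student_id: int
    activity: str        # e.g. "Dynamica: Collisions"
    t: float             # seconds since the activity started
    action: str          # e.g. "set_mass", "run_model", "answer_question"
    detail: dict = field(default_factory=dict)

def events_for_student(log, student_id):
    """Pull one student's trace out of the whole-population event stream,
    mirroring the hand-analysis of individual logs described above."""
    return [e for e in log if e.student_id == student_id]
```

With records in this shape, the same filtering code that supports hand inspection of one student's trace can be run over every student in the population.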

  3. Examples of Data as Indices of Inquiry
• Dynamica (force and motion): the Collisions activity yields indicators of students’ varying degrees of systematicity.
• BioLogica (genetics): provides evidence for and against students’ deep, domain-specific reasoning and inquiry.
• Connected Chemistry (gas laws): investigation of different students’ strategic moves in inquiry learning.

  4. Collisions task: the student sets the masses of two balls
• The challenge: adjust the masses of the two balls so that the orange ball moves as fast as possible after the collision.
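The slide does not spell out the model's physics, but if the collision is idealized as elastic and one-dimensional with the orange ball initially at rest, conservation of momentum and kinetic energy gives the orange ball's post-collision speed as 2·m_blue·v_blue / (m_blue + m_orange), which is why a heavy blue ball and a light orange ball win. A sketch under that assumption (the Dynamica model may differ in detail):

```python
def orange_speed_after(m_blue, m_orange, v_blue=1.0):
    """Post-collision speed of the orange ball, assuming an idealized
    elastic 1-D collision with the orange ball initially at rest:
        v_orange' = 2 * m_blue * v_blue / (m_blue + m_orange)
    This idealization is our assumption, not the project's stated model."""
    return 2.0 * m_blue * v_blue / (m_blue + m_orange)
```

Under this idealization, the target setting (blue = 11.0, orange = 1.0) yields a much faster orange ball than the reverse setting.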

  5. Strategies for Inquiry
Four different inquiry strategies have been identified in the two classes analyzed so far:
• haphazard
• opposite of target (the target being a light orange ball and a heavy blue ball)
• systematic (e.g., vary the mass of one ball at a time)
• correct on first trial
Examples follow…

  6. Correct on first trial (with a test to make sure)
Student 18185 got it right the first time:

Blue Ball    Orange Ball
11.0         1.0
1.0          11.0
11.0         1.0

  7. Opposite of target (incremental corrections)
Student 16001 started off wrong, got it in four trials:

Blue Ball    Orange Ball
2.0          11.0
2.0          5.0
9.0          1.0
11.0         1.0

  8. Haphazard strategy
Student 12116 made 15 trials:

Blue Ball    Orange Ball
11.0         11.0
11.0         1.0
11.0         3.0
11.0         4.0
1.0          1.0
1.0          11.0
8.0          7.0
11.0         2.0
11.0         11.0
11.0         1.0
11.0         5.0
3.0          5.0
1.0          5.0
1.0          8.0
11.0         1.0

  9. Systematic strategy (vary one ball at a time)
Student 18115 had a plan:

Blue Ball    Orange Ball
11.0         11.0
5.0          11.0
10.0         11.0
11.0         1.0
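The four strategies in the examples above can be recovered mechanically from the logged (blue, orange) mass pairs. A minimal sketch of such a classifier; the specific rules and thresholds here are our illustrative assumptions, not the project's published coding scheme:

```python
def classify_strategy(trials, target=(11.0, 1.0)):
    """Classify a sequence of (blue_mass, orange_mass) trials.

    Illustrative rules (assumptions, not the project's actual criteria):
    - first trial hits the target              -> "correct on first trial"
    - first trial is blue-light / orange-heavy -> "opposite of target"
    - (almost) every move changes one mass     -> "systematic"
    - otherwise                                -> "haphazard"
    """
    if trials[0] == target:
        return "correct on first trial"
    if trials[0][0] < trials[0][1]:
        return "opposite of target"
    moves = list(zip(trials, trials[1:]))
    single_mass_moves = sum(1 for a, b in moves if a[0] == b[0] or a[1] == b[1])
    # Allow one two-mass move, e.g. a final leap straight to the target.
    return "systematic" if single_mass_moves >= len(moves) - 1 else "haphazard"
```

Run over the four example students' trial tables, these rules reproduce the labels on slides 6–9.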

  10. BioLogica – Monohybrid Task 3: Produce only 2-legged offspring
The student must:
• predict whether it is possible,
• describe the necessary conditions,
• change the parents’ alleles to homozygous dominant and homozygous recessive,
• cross the dragons.

  11. Four Types of Performances
Student performances on this task can be categorized as follows:
• successful on first try
• successful after multiple systematic attempts
• successful after multiple unsystematic attempts
• unsuccessful
Examples follow…

  12. Successful on first attempt
Fordham student 12182 nailed it:
• Correct prediction
• Concise plan
• Changed the mother
• Changed the father
• Crossed them
• Done!

  13. Success through systematic attempts
Amarillo student 15041:
• Didn’t think it could be done
• Had a poor plan (Ll x Ll)
• Succeeded in 3 attempts, without repeating any crosses:
  LL x LL, Ll x Ll, ll x LL

  14. Success through unsystematic attempts
Fordham student 12230:
• Didn’t think it could be done
• Had no viable plan (typed only “l”)
• Succeeded in 11 attempts, repeating numerous crosses:
  LL x LL, LL x LL, LL x LL, lL x LL, lL x lL, lL x Ll, Ll x Ll, ll x Ll, ll x Ll, LL x Ll, ll x LL

  15. Failure through unsystematic attempts
Fordham student 12200:
• Didn’t think it could be done
• Had no viable plan (typed “dsafsafsafsa”)
• Gave up after 9 attempts, repeating numerous crosses:
  LL x LL, LL x LL, LL x LL, LL x LL, LL x LL, Ll x LL, Ll x Ll, Ll x Ll, Ll x ll
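A student's logged cross sequence, like the ones on the slides above, can be summarized automatically once crosses are normalized so that "lL x Ll" and "Ll x lL" count as the same cross. The sketch below is illustrative: the category labels mirror slide 11, but the repeat-based rule for systematic vs. unsystematic is our assumption, not the project's published criterion.

```python
def summarize_crosses(crosses, succeeded):
    """Summarize a student's logged crosses, e.g. ["LLxLL", "lLxLl", ...].

    Normalizes allele order within a genotype ("lL" -> "Ll") and parent
    order, so repeated crosses are detected regardless of how they were
    written.  The rule mapping repeats to categories is an assumption."""
    seen, repeats = set(), 0
    for cross in crosses:
        key = tuple(sorted("".join(sorted(g.strip())) for g in cross.split("x")))
        if key in seen:
            repeats += 1
        seen.add(key)
    if not succeeded:
        category = "unsuccessful"
    elif len(crosses) == 1:
        category = "successful on first try"
    elif repeats == 0:
        category = "successful after systematic attempts"
    else:
        category = "successful after unsystematic attempts"
    return {"attempts": len(crosses), "repeats": repeats, "category": category}
```

Applied to the example logs, student 15041's three non-repeating crosses come out "systematic," while student 12200's nine crosses with many repeats come out "unsuccessful."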

  16. Inquiry in Connected Chemistry
Students are asked to make the pressure monitor read zero. Possible solutions: have no particles in the box, or have very few particles, so that the pressure sometimes reads zero.

  17. Three inquiry strategies employed

  18. Implementation data
We can learn a lot about a particular implementation by observing:
• the number and order of activities completed
• the average length of time and “engagement index” (nodes traversed per unit time) for each activity
• the percentage of open-ended questions answered (ascertained through spell-checking)
• the number and quality of teacher communiqués
• how often the teacher uses the reports
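Two of the measures named above reduce to simple arithmetic over the logs. A minimal sketch, with per-minute units for the engagement index assumed (the slide only says "per unit time"), and with the spell-checking step for screening gibberish answers omitted:

```python
def engagement_index(nodes_traversed, elapsed_minutes):
    """The slide's 'engagement index': nodes traversed per unit time.
    Per-minute units are an assumption; the slide doesn't fix the unit."""
    return nodes_traversed / elapsed_minutes if elapsed_minutes else 0.0

def open_response_rate(answers):
    """Fraction of open-ended questions with a non-blank answer.
    The slide mentions spell-checking to screen out gibberish such as
    'dsafsafsafsa'; that filtering step is omitted from this sketch."""
    if not answers:
        return 0.0
    return sum(1 for a in answers if a and a.strip()) / len(answers)
```

Both functions operate on quantities that are already present in the event logs, so they can be computed per activity and per class without any extra instrumentation.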

  19. So what does the computer “buy” us?
The researcher can:
• identify typical patterns of student behavior and classroom implementation, and correlate them with learning results evaluated in other ways (e.g., pre–post test gains)
• compare students’ performance over time and across disciplines, using a very large population
The teacher can:
• identify individual students’ problems early, and intervene appropriately
• stratify the class and divide it into smaller sections for treatment of particular topics
The administrator can:
• compare implementations and results across teachers

  20. Conclusions
A new era in educational research…
• extremely fine-grained data
• very large experimental populations
…calls for a new specialty, data mining…
• new tools
• new metrics
…with important implications for assessment:
• closer ties between assessment and learning
• performance assessment becomes possible
