Choice-based assessments Dan Schwartz


Presentation Transcript


  1. Choice-based assessments Dan Schwartz http://aaalab.stanford.edu

  2. Assessment and Instruction • Tests often favor full-blown declarative and procedural knowledge. • They miss the value of many other types of learning, and pull for the “mastery” of facts and steps. • Preparation-for-future-learning (PFL) tests can lift out the value of non-paradigmatic learning experiences. • These tests include opportunities to learn, and we measure whether prior learning experiences prepare people to learn from those opportunities.

  3. Experience and Explanation • Dylan Arena, Kidaptive.com • The hidden value of videogames. • Arena, D. A., & Schwartz, D. L. (2013). Experience and explanation: Using videogames to prepare students for formal instruction in statistics. Journal of Science Education and Technology.

  4. Interactive PFL • A limitation of PFL research has been the static nature of the learning resources in the PFL tests. • Lecture, worked example, text. • A good deal of learning comes through interaction. • Self-directed, interactive learning involves a series of choices. • Students who are prepared for future learning should make good choices about learning.

  5. Choice as the proper outcome of education • No time here for a deeper discussion of the normative goals of education and the appropriateness of measuring choice, but see the book (free download from MIT Press). • Here, we just present an extended example for consideration.

  6. Choicelets: Measure learning choices.

  7. Posterlet • Doris Chin, Maria Cutumisu, Kristen Blair • 1) Choose booth → 2) Design poster → 3) Choose focus group to test poster → 4) Choose - or + feedback → 5) Read feedback → Revise? • If yes: 6) Re-design poster. • If no: 7) Post poster (see ticket sales).
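
Read as telemetry, each round of the flow above is a handful of logged choices. Below is a minimal simulation sketch, assuming three feedback requests per round and three rounds (consistent with the 0-9 and 0-3 counts on slide 9); the booth names, field names, and structure are illustrative, not the actual game code.

```python
import random
from dataclasses import dataclass

@dataclass
class RoundLog:
    """Choices logged during one Posterlet round (illustrative field names)."""
    booth: str
    negative_feedback_requests: int = 0   # 0-3 per round; 0-9 over three rounds
    revised: bool = False

def play_round(booths, rng=random):
    """Simulate one round with random choices, purely to show the logging shape."""
    log = RoundLog(booth=rng.choice(booths))      # 1) choose booth
    # 2) design the poster ... (poster editor omitted)
    for _ in range(3):                            # 3) a focus group tests the poster
        if rng.random() < 0.5:                    # 4) choose "-" or "+" feedback
            log.negative_feedback_requests += 1
        # 5) read the feedback ...
    log.revised = rng.random() < 0.5              # revise? yes -> 6) re-design poster
    return log                                    #          no -> 7) post poster

logs = [play_round(["ferris wheel", "dunk tank", "face painting"]) for _ in range(3)]
print(logs)
```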

  8. Feedback • Both positive (“I like…”) and negative (“I don’t like…”) feedback are informative in the game. • Built a system that evaluates 21 graphic design elements. • Negative does not mean punishing. • The literature indicates that negative feedback is better for learning, but it carries a risk of ego threat. • To my knowledge, nobody has examined the choice to seek negative feedback. • Central to many “design thinking” pedagogies.
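
One simple way to realize such a system is a rule-based checker over discrete poster features, where each rule carries both an “I like…” and an “I don’t like…” phrasing. The sketch below is illustrative only: the real engine covers 21 graphic design elements, and these three rules and their wordings are invented.

```python
# Illustrative rule-based feedback; not the actual Posterlet engine.
# Each rule pairs a poster check with a positive and a negative phrasing.
DESIGN_RULES = [
    (lambda p: p["font_size"] >= 18,
     "I like that the words are big enough to read.",
     "I don't like that the words are too small to read."),
    (lambda p: p["shows_price"],
     "I like that it says how much tickets cost.",
     "I don't like that it doesn't say how much tickets cost."),
    (lambda p: p["n_images"] <= 3,
     "I like that the poster isn't too crowded.",
     "I don't like that the poster is too crowded."),
]

def feedback(poster, want_negative):
    """Return one comment matching the valence the player asked for, if any rule applies."""
    for check, praise, critique in DESIGN_RULES:
        if want_negative and not check(poster):
            return critique
        if not want_negative and check(poster):
            return praise
    return "I have nothing more to say about this poster."

print(feedback({"font_size": 12, "shows_price": True, "n_images": 5}, want_negative=True))
```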

  9. Research design • Managed to get about 450 children at two different “game-themed” schools: Quest schools in NYC and Chicago. • Children played Posterlet for about 20 minutes. • We tracked: • Number of times they chose negative feedback (0-9) • Number of times they revised (0-3) • Quality of posters (based on 21 features) • Performance on a posttest of the 21 graphic design principles.
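
For analysis, these tracked measures fit a one-row-per-child record. A minimal sketch follows; the field names and example values are my own, not the study's.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """One child's Posterlet play and posttest (illustrative field names)."""
    student_id: str
    negative_feedback_count: int   # 0-9: times negative feedback was chosen
    revision_count: int            # 0-3: times the child chose to revise
    poster_quality: float          # scored against the 21 graphic-design features
    posttest_score: float          # posttest on the 21 graphic-design principles

records = [
    StudentRecord("s001", negative_feedback_count=6, revision_count=2,
                  poster_quality=14.0, posttest_score=17.0),
    StudentRecord("s002", negative_feedback_count=1, revision_count=0,
                  poster_quality=9.0, posttest_score=11.0),
]
```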

  10. Step 1: Internal Validation • For 2+2, there is no argument that 4 is the correct answer. • With choices, the story is different. People can rightly argue: who are you to say that choosing negative feedback is the right choice for learning? • To address this, we examine correlations between choices and measures of learning or performance.
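
Concretely, this internal-validation step is a correlation between each logged choice measure and the in-game learning and performance measures. A minimal sketch with Python's standard-library Pearson correlation; all numbers below are invented for illustration and are not the study's data.

```python
from statistics import correlation  # standard library, Python 3.10+

# Invented per-child measures, aligned by index; not the study's data.
negative_feedback = [6, 1, 4, 8, 2, 5]        # 0-9 choices of negative feedback
revisions         = [2, 0, 1, 3, 1, 2]        # 0-3 choices to revise
posttest          = [17, 11, 14, 19, 12, 15]  # posttest on the 21 design principles

print("negative feedback vs posttest: r =", round(correlation(negative_feedback, posttest), 2))
print("revisions vs posttest:         r =", round(correlation(revisions, posttest), 2))
```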

  11. Evidence on Internal Validation (choosing negative feedback leads to better learning) • No differences on the 1st poster, so the result is not due to some prior knowledge of graphic design principles.

  12. Step 2: External Validation • Does behavior in the game world say anything about “real” world learning? • The Posterlet finding may not extend beyond games. • Motivations for various choices in a game may not reflect motivations in the world.

  13. External Validation (We received standardized test scores for a subset of the students.) • Nice to have a replication! Replication is missing in many data-mining efforts, not to mention the very idea of external validation.
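
In practice, external validation here is the same correlation, computed after joining the game measures to the standardized scores available for the subset. A minimal sketch with invented identifiers and numbers, not the study's data.

```python
from statistics import correlation  # standard library, Python 3.10+

# Invented game measures keyed by student id, and standardized scores that are
# available for only a subset of those students.
negative_feedback = {"s001": 6, "s002": 1, "s003": 4, "s004": 8, "s005": 2, "s006": 5}
standardized      = {"s001": 310, "s003": 295, "s004": 335, "s006": 305}

shared = sorted(set(negative_feedback) & set(standardized))  # join on the subset
game   = [negative_feedback[s] for s in shared]
ext    = [standardized[s] for s in shared]
print(f"n = {len(shared)}, r = {correlation(game, ext):.2f}")
```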

  14. Step 3: Experience Validation • Ultimately, we want to measure experiences not individuals. • Want to help improve instruction and flag kids who could use some attention. • To know if we can measure the effects of experience, we need to know if experience has an influence on the assessment.

  15. Experience Validation

  16. Summary • Choice to seek negative feedback correlated with better learning. • Is there another demonstration like this out there? • Seeking negative feedback seems to be a more stable correlate of learning than choosing to revise. • Like all assessments, if you do not have control over the experience, it is hard to know what experience you are measuring.

  17. Current Activities • Now that we can measure learning choices (so far), we want to show that it is possible to create experiences that influence choice behaviors. • Working to build brief design thinking curricula. • Complementary suite of design thinking choicelets. • Conduct an experiment where some get curricula, some not. • We’ll see. • In the meantime…
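
Once that experiment runs, the experiential check is a straightforward comparison of choice behavior across conditions. A sketch with a two-sample t-test follows; SciPy is assumed available, and the numbers are invented for illustration.

```python
from scipy.stats import ttest_ind  # SciPy assumed available

# Invented negative-feedback counts (0-9) for children who did and did not get
# the brief design-thinking curriculum; not real data.
with_curriculum    = [7, 6, 8, 5, 9, 6, 7]
without_curriculum = [4, 3, 6, 2, 5, 4, 3]

result = ttest_ind(with_curriculum, without_curriculum)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```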

  18. Emerging Design Principles
  • PFL Principle
    • There must be something new to learn within the assessment.
  • Choice Principles
    • Learning must depend on making a choice.
    • The game can be completed without choosing to learn. (It is an unforced choice.)
  • Typical Performance Principle
    • The game should be designed to measure typical behavior, not test behavior.
  • Validation Principles
    • Internal – show the choice leads to learning in the game.
    • External – show the choice correlates with learning outside the game.
    • Experimental – show that experiences outside the game can influence choices.
  • Data Abstraction Principles
    • Conceptualize the game as having different “regions” of choice.
    • Aggregate regions of choices within rounds, and analyze rounds sequentially (sketched below).
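
The data abstraction principles suggest a simple log-processing step: tag each logged choice with its region and round, aggregate within rounds, then look across rounds. A minimal sketch with an invented event format:

```python
from collections import defaultdict

# Invented choice-log events: (round number, choice region, value).
events = [
    (1, "feedback_valence", "negative"), (1, "feedback_valence", "positive"),
    (1, "revise", "no"),
    (2, "feedback_valence", "negative"), (2, "feedback_valence", "negative"),
    (2, "revise", "yes"),
    (3, "feedback_valence", "negative"), (3, "revise", "yes"),
]

# Aggregate choices by region within each round ...
per_round = defaultdict(lambda: defaultdict(int))
for rnd, region, value in events:
    per_round[rnd][(region, value)] += 1

# ... then analyze rounds sequentially, e.g. negative-feedback seeking over time.
trajectory = [per_round[r][("feedback_valence", "negative")] for r in sorted(per_round)]
print("negative feedback chosen per round:", trajectory)   # -> [1, 2, 1]
```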
