
Assessing the impact of computer problem solving coaches: preliminary steps

This study examines the impact of computer problem-solving coaches on students' performance and usage. It assesses students' baseline problem-solving performance and how they use the coaches. Preliminary findings suggest that the problem-solving rubric effectively distinguishes students' skill levels and that students find the coaches useful.

Presentation Transcript


  1. Assessing the impact of computer problem solving coaches: preliminary steps. Qing Xu, Ken Heller, Leon Hsu, Andrew Mason, Anne Loyle-Langholz (University of Minnesota, Twin Cities). AAPT Summer 2011 Meeting, Omaha, NE. Supported by NSF DUE #0230830 and DUE #0715615 and by the University of Minnesota.

  2. Preliminary steps. I. Characterizing how students perform without the coaches (baseline). II. Characterizing how students use the coaches (usage).

  3. I. Baseline • Introductory mechanics at the University of Minnesota, Spring 2011 • 4 quizzes, 2 problems per quiz • Applied the problem-solving rubric to a representative sample (38 of 108 students); the rubric assesses a written solution along 5 dimensions: Useful Description, Physics Approach, Specific Application of Physics, Mathematical Procedure, Logical Progression • Two expert raters each scored all 304 solutions (38 students × 8 problems) • Students were grouped into 3 tiers by exam score (tier 1 = high performing, tier 2 = medium performing, tier 3 = low performing); one possible tiering scheme is sketched below
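
The slide says only that tiers were "determined by exam scores." As a minimal sketch of one plausible scheme, the Python below splits students into exam-score terciles and averages rubric scores per dimension within each tier. The field names (exam_score, rubric) and the tercile cut are assumptions for illustration, not the authors' actual procedure.

```python
from statistics import mean

# Hypothetical: the five rubric dimensions named on the slide.
DIMENSIONS = ["useful_description", "physics_approach",
              "specific_application", "mathematical_procedure",
              "logical_progression"]

def assign_tiers(students):
    """Split students into thirds by exam score; tier 1 = top third."""
    ranked = sorted(students, key=lambda s: s["exam_score"], reverse=True)
    third = len(ranked) // 3
    for i, s in enumerate(ranked):
        s["tier"] = 1 if i < third else (2 if i < 2 * third else 3)
    return ranked

def tier_profile(students, tier):
    """Mean rubric score on each dimension for one tier."""
    group = [s for s in students if s["tier"] == tier]
    return {d: mean(s["rubric"][d] for s in group) for d in DIMENSIONS}
```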

  4. Qualitative differences between tiers (figure slide)

  5. Evolution of scores (figure slide)

  6. What’s next? • Increasing the sample size • E&M and a fall-semester mechanics baseline

  7. II. Usage: Total time • Reasonable completion time: 20–40 minutes per module • Students tend to stay on task: only 2 of 18 students had at least one break of more than 5 minutes on a question (Fall 2010 data); a break-detection sketch follows below
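
A minimal sketch of the stay-on-task check described above: flag a student if any gap between consecutive logged clicks within one question exceeds 5 minutes. The log format, a list of (question_id, timestamp) pairs, is an assumption; the actual coach logs may differ.

```python
BREAK_THRESHOLD = 5 * 60  # seconds

def has_long_break(clicks):
    """clicks: list of (question_id, unix_timestamp) pairs (assumed format)."""
    by_question = {}
    for qid, t in clicks:
        by_question.setdefault(qid, []).append(t)
    # A "break" is a gap of more than 5 minutes between clicks on one question.
    for times in by_question.values():
        times.sort()
        if any(b - a > BREAK_THRESHOLD for a, b in zip(times, times[1:])):
            return True
    return False
```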

  8. II. Usage: Automated time (figure slide)

  9.–10. II. Usage: Normalize to the shortest clicks (figure slides)

  11. Summary • The problem-solving rubric does distinguish students at different levels of problem-solving skill. • Scores achieved by students in all tiers remain constant across all categories as a function of time. • Students found the coaches useful and treated them seriously. • http://groups.physics.umn.edu/physed • POSTER: PST2C56, Tue 08/02, 6:00PM - 6:45PM (Kiewit Fitness Center Courts)

  12. II. Usage: Student preference • Student preference for each coach type • Faculty tended to disagree with students (they found type 1 tedious) • One student initially preferred type 1 but switched to type 3 after gaining familiarity with the physics

  13.–16. Qualitative differences between tiers (figure slides)

  17.–20. Evolution of scores (figure slides)

  21. II. Usage: Normalize to the shortest clicks • Distribution suggests students are taking the tutors seriously • Median time = 4.58 s (see the timing sketch below)
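
A minimal sketch of the per-click timing statistic behind the median quoted above: take the elapsed time between consecutive logged clicks and report the median. The exact normalization the authors used ("normalize to the shortest clicks") is not spelled out in the transcript, so this simple inter-click-interval version is an assumption.

```python
from statistics import median

def median_click_time(timestamps):
    """Median elapsed time (seconds) between consecutive clicks."""
    ts = sorted(timestamps)
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    return median(gaps)

# Usage with made-up click times (seconds since module start):
# median_click_time([0.0, 3.1, 7.9, 12.4]) -> 4.5
```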
