
Adventures in CPS


Presentation Transcript


  1. Adventures in CPS
  Aaron Bruck
  Towns Group Meeting
  September 25, 2007

  2. Goals for the summer
  • Complete CPS projects: 2 different projects
    • Quantification and categorization of CPS questions
    • Linking CPS results to results on exams
  • Look into new ideas for research (and possibly an OP)
    • Scientific literacy
    • Assessment tools

  3. What is CPS?
  • Students use “clickers” to respond to instructor-generated questions
  • These responses are stored in an online database
  • Students receive grades based on the number of questions answered correctly
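To make the grading mechanism concrete, here is a minimal sketch of how scores could be tallied from stored clicker responses. The Python code, the record layout, and the example data are assumptions for illustration only, not the actual CPS database schema.

```python
# Hypothetical sketch: tallying CPS-style grades from stored responses.
# The record layout and scoring rule are assumptions, not the real CPS schema.
from collections import defaultdict

# Each record: (student_id, question_id, answer_given, correct_answer)
responses = [
    ("s01", "q1", "B", "B"),
    ("s01", "q2", "A", "C"),
    ("s02", "q1", "B", "B"),
    ("s02", "q2", "C", "C"),
]

correct_counts = defaultdict(int)
for student, _question, given, correct in responses:
    correct_counts[student] += int(given == correct)

# Grade is based on the number of questions answered correctly, as on the slide.
for student, n_correct in sorted(correct_counts.items()):
    print(f"{student}: {n_correct} correct")
```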

  4. Categorization
  • We decided to categorize the questions in the following ways:
    • Solo vs. buddy
    • Definition vs. algorithmic vs. conceptual
    • Using Bloom’s Taxonomy
    • Using the Smith/Nakhleh/Bretz framework1
  • We also compared our analyses with those of Mazur2 to make sure we were looking for the right things.

  1 Smith, K. C., Nakhleh, M. B., & Bretz, S. L. An expanded framework for analyzing general chemistry exams. Journal of Chemical Education, in press.
  2 Fagen, A. P., Crouch, C. H., & Mazur, E. (2002). Peer instruction: Results from a range of classrooms. The Physics Teacher, 40, 206-209.
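As an illustration of the bookkeeping behind this categorization, here is a small sketch of tallying questions by the labels above. The Python code, the example questions, and the labels are hypothetical; the actual coding was done by hand against the cited frameworks.

```python
# Hypothetical sketch: cross-tabulating CPS questions by the categories above.
# The question list and labels are invented; the real coding was done by hand.
from collections import Counter

questions = [
    {"mode": "solo",  "type": "definition",  "bloom": "knowledge"},
    {"mode": "buddy", "type": "algorithmic", "bloom": "application"},
    {"mode": "solo",  "type": "conceptual",  "bloom": "comprehension"},
    {"mode": "buddy", "type": "definition",  "bloom": "knowledge"},
]

# Count how many questions fall into each label of each scheme.
for field in ("mode", "type", "bloom"):
    counts = Counter(q[field] for q in questions)
    print(field, dict(counts))
```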

  5. Categorization, cont.
  • Here are the results from one of the sections (others followed a similar trend):

  6. More categorization

  7. Results by Category
  • A two-tailed t-test (solo/buddy) and one-way ANOVAs (all other categories) were performed to test for statistical differences in the data
  • The analyses showed no significant differences between any of the categories in how students performed on the questions
  • The only exception was the solo/buddy comparison for one professor
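For reference, here is a minimal sketch of the two tests named above, written in Python with scipy (an assumption; the slide does not say what software was used). The score arrays are placeholders, not the study data.

```python
# Placeholder data and Python/scipy calls for the tests named on the slide;
# the numbers are illustrative, not the study data.
from scipy import stats

solo_scores  = [0.62, 0.71, 0.55, 0.68, 0.74]   # per-question fraction correct, solo
buddy_scores = [0.70, 0.78, 0.66, 0.81, 0.73]   # per-question fraction correct, buddy

# Two-tailed independent-samples t-test (solo vs. buddy).
t_stat, p_two_tailed = stats.ttest_ind(solo_scores, buddy_scores)

# One-way ANOVA across one of the other category schemes (three groups here).
definition  = [0.80, 0.75, 0.83]
algorithmic = [0.72, 0.69, 0.77]
conceptual  = [0.64, 0.70, 0.66]
f_stat, p_anova = stats.f_oneway(definition, algorithmic, conceptual)

print(f"t = {t_stat:.3f}, p = {p_two_tailed:.3f}")
print(f"F = {f_stat:.3f}, p = {p_anova:.3f}")
```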

  8. Solo/Buddy Analysis
  • Prompted by the unusual result, we investigated the solo/buddy comparison further
  • We also looked at pairs of solo/buddy questions asked one after the other
  • T-test results: t = -2.699, p = 0.017 (significant difference)
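A sketch of this back-to-back comparison, assuming the matched solo/buddy question pairs are analyzed with a paired two-tailed t-test in Python/scipy; the slide does not state the exact test setup, and the values below are placeholders rather than the data behind t = -2.699, p = 0.017.

```python
# Assumed paired comparison of back-to-back solo/buddy questions; the scores
# are placeholders, not the data behind the reported t = -2.699, p = 0.017.
from scipy import stats

solo_first = [0.58, 0.63, 0.55, 0.61, 0.67, 0.59]   # fraction correct, solo ask
buddy_redo = [0.66, 0.70, 0.64, 0.63, 0.75, 0.68]   # fraction correct, buddy re-ask

t_stat, p_value = stats.ttest_rel(solo_first, buddy_redo)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
```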

  9. That’s great, but…
  • We found a significant difference between solo and buddy questions… but is it worth anything?
  • Our next step was to see whether this apparent difference in performance by question style translated into better scores on the exams.

  10. Exam Analysis
  • We compared exam questions with the questions asked in class using CPS.
  • Surprisingly, very few questions on the exams corresponded, directly or indirectly, to CPS questions.
  • Each exam was analyzed individually before all of the data were pooled to look for any overall effects.

  11. Exam Analysis
  • All analyses showed no significant differences at the p = 0.05 significance level.

  12. Instructor Effects
  • We also ran an analysis to check for any instructor effects that could have skewed the data.
  • Results showed no significant differences at the p = 0.05 level.

  13. Is CPS better than nothing?
  • A final analysis compared exam questions that corresponded to CPS questions with those that did not.
  • Unfortunately, no significant differences were found, though the average score was higher for the CPS-related questions.

  14. CPS vs. Nothing Results
  • Descriptive statistics and ANOVA results (tables shown on the slide)
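A minimal sketch, in Python/scipy (an assumption), of the descriptive-statistics-plus-ANOVA comparison between CPS-linked and non-linked exam questions; the scores are invented for illustration. With only two groups, the one-way ANOVA is equivalent to a two-tailed t-test.

```python
# Invented scores for the CPS-linked vs. non-linked exam-question comparison;
# descriptive statistics plus a one-way ANOVA, as listed on the slide.
import statistics
from scipy import stats

cps_linked     = [0.74, 0.69, 0.80, 0.71, 0.77]   # exam fraction correct, CPS-linked items
not_cps_linked = [0.70, 0.66, 0.75, 0.68, 0.72]   # exam fraction correct, other items

# Descriptive statistics for each group.
for name, scores in [("CPS-linked", cps_linked), ("not linked", not_cps_linked)]:
    print(f"{name}: mean = {statistics.mean(scores):.3f}, "
          f"sd = {statistics.stdev(scores):.3f}, n = {len(scores)}")

# One-way ANOVA across the two groups.
f_stat, p_value = stats.f_oneway(cps_linked, not_cps_linked)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")
```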

  15. Conclusions
  • CPS is an effective lecture tool that engages students interactively with the course content
  • Most CPS questions are low-level questions in terms of Bloom’s Taxonomy and the other categorization schemes
  • Students seem to learn content through interaction with their peers when using CPS, though this does not necessarily translate into success on exams

  16. What else did I do?
  • Research questions
    • In the event that I need to do a project other than the NSDL project, what avenues are available?
    • Could any of these ideas turn into a possible OP in the following months?
  • Ideas of interest
    • Scientific literacy
      • What is the value of a textbook? Could other materials help?
    • Assessment
      • Immediate feedback assessment technique (IFAT): could it work in chemistry?
