
Comparative effectiveness of research ethics teaching methods


Presentation Transcript


  1. Comparative effectiveness of research ethics teaching methods Michael Kalichman and Dena Plemmons UC San Diego Research on Research Integrity Annual Meeting Niagara Falls, NY May 16, 2009

  2. Goal • Assess the effectiveness of teaching research ethics • Challenges: different teaching objectives, different institutions and audiences, different instructors • Therefore: assess the relative effectiveness of different methods for teaching research ethics

  3. Approach: The Course • The course: Scientific Ethics (UC San Diego) • 10 sessions, 1.5 hrs per week • 3 sections each week • 2 instructors (DP, MK) • Teaching methods: Week 1: lecture; Weeks 2-10: one of three methods per section: Lecture + small group case discussion, Lecture + role-play, or Case-based lecture

  4. Approach: The Students • Students: graduate students, some postdocs • Fields: Biology, Neurosciences, other (e.g., Bioengineering) • Institutions: UC San Diego, other (Salk, The Scripps Research Institute) • Number: total = 57; 18–20 assigned per section

  5. Approach: Randomization • L = Lecture + small group case discussion • R = Lecture + role-play • C = Case-based lecture [the slide's grid assigning sections to methods by week is not reproduced in the transcript]
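A brief illustration of what the counterbalanced design implied by this legend could look like in code. This is a sketch under assumptions: the actual week-by-week assignment grid is not in the transcript, and the Latin-square rotation, section labels, and seed are invented for illustration.

```python
import random

# Method codes from slide 5.
METHODS = {
    "L": "Lecture + small group case discussion",
    "R": "Lecture + role-play",
    "C": "Case-based lecture",
}

def rotation_schedule(sections=("1", "2", "3"), weeks=range(2, 11), seed=0):
    """Rotate three sections through the three methods over weeks 2-10,
    Latin-square style, so each section sees each method equally often.
    This is a plausible reconstruction, not the course's actual grid."""
    rng = random.Random(seed)
    codes = list(METHODS)
    rng.shuffle(codes)  # randomize which method each section starts with
    schedule = {}
    for w, week in enumerate(weeks):
        for s, section in enumerate(sections):
            schedule[(week, section)] = codes[(s + w) % 3]
    return schedule

for (week, section), code in sorted(rotation_schedule().items()):
    print(f"week {week:2d}  section {section}: {code} = {METHODS[code]}")
```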

  6. Approach: Outcomes • Pre-test: 18 multiple-choice questions (2 for each of 9 topics) • Weekly quiz: 5 multiple-choice questions, 3 attitude questions about the topic, and 2 attitude questions about the survey (enjoyable? useful?) • Post-test: same as pre-test • Final evaluation: satisfaction, perspectives
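To make the scoring of these instruments concrete, here is a minimal sketch of scoring the 18-item pre/post test by topic. The topic names, the answer key, and the assumption that consecutive question pairs share a topic are all invented; the slides give only the counts.

```python
# Topic names are placeholders; the slides say only "9 topics".
TOPICS = [f"topic_{i}" for i in range(1, 10)]

def score_test(answers, key):
    """Score one student's 18 answers against an answer key.

    Assumes questions 2*i and 2*i+1 belong to TOPICS[i], giving a
    per-topic score of 0-2 and a total score of 0-18.
    """
    per_topic = {
        topic: sum(answers[j] == key[j] for j in (2 * i, 2 * i + 1))
        for i, topic in enumerate(TOPICS)
    }
    return sum(per_topic.values()), per_topic

key = ["b"] * 18                  # placeholder answer key
answers = ["b"] * 12 + ["a"] * 6  # hypothetical student: 12 of 18 correct
total, by_topic = score_test(answers, key)
print(total, by_topic)
```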

  7. Overall Impact of Course: Perceptions How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course? • “not at all” • “not much”

  8. Overall Impact of Course: Perceptions How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course? • “great increase in knowledge” • “definitely more aware and now know options available” • “the course stressed topics that are easy to ignore, …useful for recognizing similar situations before it's too late”

  9. Overall Impact of Course: Perceptions How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course? • “…I will be able to more effectively deal with many types of situations” • “learned new strategies for dealing with complex ethical issues” • “I am now better prepared to deal with problems like the ones discussed in class”

  10. Overall Impact of Course: Perceptions How, if at all, have your awareness, knowledge, skills, or attitudes been changed by participating in this course? • “It has helped me to see other perspectives and change my mind on several topics.” • “it increase[d] my moral outrage”

  11. Sample Question #1 Research data are the property of: • The institution which employs those who are collecting the data. • The person(s) who collect the data. • The head of the research group that collects the data.

  12. Sample Question #2 The primary basis for requirements for review of human subjects research is in regulations created by the • State government • Federal government • University

  13. Overall Impact of Course: Knowledge • Knowledge improved (P < 0.001) • However, improvement was modest • Pre-test median = 12.0 (of 18) • Post-test median = 13.0 (of 18)
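The slide reports a significant gain but does not name the test. For paired scores summarized by medians, a Wilcoxon signed-rank test is one standard choice; the sketch below runs that analysis on invented scores, since the real data are not in the transcript.

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
n = 57  # number of students, from slide 4

# Hypothetical paired scores out of 18, for illustration only.
pre = rng.integers(8, 16, size=n)
post = np.clip(pre + rng.integers(0, 4, size=n), 0, 18)

stat, p = wilcoxon(pre, post)  # paired, nonparametric test
print(f"pre median = {np.median(pre):.1f}, post median = {np.median(post):.1f}")
print(f"Wilcoxon signed-rank: statistic = {stat:.1f}, p = {p:.3g}")
```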

  14. Effects of Methods • Knowledge: no statistically significant difference among methods for any of the 9 test weeks. • Attitudes: no statistically significant difference among methods for any of the 9 test weeks.

  15. Perceptions of Methods • Useful or enjoyable? • No statistically significant difference among methods for 5 of the 9 test weeks. • In 4 of the 9 test weeks, Lecture + role-play was judged more enjoyable than Case-based lecture and/or Lecture + case discussion. • In 3 of the 9 test weeks, Lecture + role-play was judged more effective than Case-based lecture and/or Lecture + case discussion. • However, when the methods were compared across all weeks (2-factor ANOVA), there was no effect of method.
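As a sketch of the pooled comparison the slide describes, here is a 2-factor ANOVA (method and week as factors) using statsmodels. The long-format layout, the 1-5 rating scale, the column names, and the ratings themselves are all assumptions for illustration; with random ratings, no method effect should appear, which matches the slide's finding.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Hypothetical long-format ratings: one row per student per test week.
rows = [
    {"week": week, "method": method, "enjoyable": int(rng.integers(1, 6))}
    for week in range(2, 11)         # the nine test weeks
    for method in ("L", "R", "C")    # method codes from slide 5
    for _ in range(18)               # roughly 18 students per section
]
df = pd.DataFrame(rows)

# Two-factor ANOVA: does method affect ratings once week is accounted for?
model = smf.ols("enjoyable ~ C(method) + C(week)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```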

  16. Student Preference - Methods • Which of the methods did you find to be most useful for meeting the goals of this course?

  17. Student Preference - Methods • Which of the methods did you find to be most enjoyable for meeting the goals of this course?

  18. Student Preference - Methods • In future courses, would you recommend using: [response options and results not included in the transcript]

  19. Student Preference - Comments • “I really disliked the role-playing. I didn’t think it was beneficial at all.” • 13 of 51 respondents (>25%) specifically commented on not liking the role-play exercises

  20. Summary and Conclusions • Summary • Student perceptions: positive impact on knowledge, skills, and attitudes • Knowledge improved, but is it worth the cost? • No difference among methods for knowledge or attitudes • Lecture + role-play was considered more enjoyable and/or useful during several weeks of the course, but was least liked overall at the end of the course • Mixed methods were preferred • Conclusions • The impact of the course on attitudes needs to be assessed • Teachers may be more important than methods
