
Teaching Research Ethics or Learning in Practice? Preventing Fraud in Science

Dies Natalis Lecture, ISS The Hague, 9 October 2014. Kees Schuyt, PhD, LL.M; Sociology professor emeritus, University of Amsterdam; chair, National Office of Research Integrity (2006-2015).


Presentation Transcript


  1. Teaching Research Ethics or Learning in Practice? Preventing Fraud in Science
  Dies Natalis Lecture, ISS The Hague, 9 October 2014
  Kees Schuyt, PhD, LL.M; Sociology professor emeritus, University of Amsterdam; chair, National Office of Research Integrity (2006-2015)

  2. Two phenomena, five topics
  • Scientific integrity (what it is and isn't)
  • Data management (good and bad practices)

  3. Five topics:
  • What do we want to prevent?
  • Good and bad practices
  • Why does it happen? Tentative explanations
  • What is to be done? Rules or principles
  • Educating, learning, mentoring

  4. 1. What do we want to prevent?
  • History of fraud in science (Baltimore case (1986-1996) as turning point; US Office of Research Integrity, 1994)
  • Broad and Wade (1983); Van Kolfschooten (1996, 2012); Grant (2008)
  • Levelt report on the Stapel case (2011/2012)
  • What can we learn from incidents (outliers)? (teamwork; the system is not watertight: good data management)

  5. Scientific integrity
  • Integrity is a self-chosen commitment to professional values (B. Williams 1973)
  • Resnik: "striving to follow the highest standards of evidence and reasoning in the quest to obtain knowledge and to avoid ignorance" (The Ethics of Science, 1998)
  • Integrity is context-bound, e.g. fabulation in novels and fabulation in science; leading values in science (Merton 1949)
  • Codes of Conduct: NL 2005/2012; ESF 2010

  6. Violations
  Violations of the game rules of science (FFP):
  • fabrication (or fabulation)
  • falsification
  • plagiarism
  Difference between F and P?

  7. 2. Good and bad practices
  • Questionable research practices (trimming, cooking, pimping, sloppiness, careless data management, not archiving)
  • Drawing the line (raw data, co-authorship, impolite behaviour)

  8. Trimming and cooking (Babbage 1830)
  • Trimming: "consists of clipping off little bits here and there from those observations which differ most in excess of the mean, and in sticking them on to those which are too small"
  • Cooking: "to give ordinary observations the appearance and character of those of the highest degree of accuracy. One of its numerous processes is to make multitudes of observations, and out of these to select only those which agree, or very nearly agree"

  9. Metaphorically: "if a hundred observations are made, the cook must be very unlucky if he cannot pick out fifteen or twenty which will do for serving up" (Charles Babbage, Reflections on the Decline of Science in England and Some of Its Causes, 1830; 1989 edition edited by Hyman)
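To make Babbage's two recipes concrete, here is a minimal Python sketch. It is my own illustration, not part of the lecture; the simulated data, thresholds, and clip amounts are invented. Trimming nudges the extreme observations toward the mean, while cooking keeps only the fifteen or twenty observations that already agree, faking a precision the raw data never had.

```python
# A minimal sketch, not from the lecture: simulated data, thresholds and
# clip amounts are invented to illustrate Babbage's trimming and cooking.
import random
import statistics

random.seed(1)

# One hundred noisy observations of a true value of 10.0.
obs = [random.gauss(10.0, 2.0) for _ in range(100)]
mean = statistics.mean(obs)

# Trimming: clip a little off the observations far above the mean and
# stick it onto those far below, shrinking the apparent spread.
trimmed = [x - 1.0 if x > mean + 2.0 else x + 1.0 if x < mean - 2.0 else x
           for x in obs]

# Cooking: out of a hundred observations, keep only the fifteen or twenty
# "which will do for serving up" -- those that closely agree.
cooked = [x for x in obs if abs(x - mean) < 0.5]

for name, data in [("raw", obs), ("trimmed", trimmed), ("cooked", cooked)]:
    print(f"{name:8s} n={len(data):3d} mean={statistics.mean(data):5.2f} "
          f"sd={statistics.stdev(data):4.2f}")
```

Both manipulations leave the mean roughly intact, which is what makes them hard to spot; only the standard deviation betrays them, and only if the raw data were archived.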

  10. Four main distinctions:
  • honest vs dishonest, fraudulent
  • good vs bad practices
  • controversies vs dishonest research
  • game rules vs goal rules

  11. Data management
  The scientific research cycle:
  • 3 strong controlling points: grants, peer review, scientific community
  • 2 weak points: primary process and data archiving
  • Wide variations between disciplines: is everything okay?
  • Bad to good practices: single vs teamwork
  • Scale of research: international data-gathering; protocols
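One concrete remedy for the weak archiving point, sketched below as my own illustration (the function name, directory layout, and file names are hypothetical, not from the lecture): freeze the raw data right after collection with a checksum manifest, so any later silent edit of the archived files can be detected.

```python
# A minimal sketch (my illustration, not from the lecture) of one good
# data-management practice: a checksum manifest over the raw data files.
# Directory layout and file names are hypothetical.
import hashlib
from pathlib import Path

def checksum_manifest(raw_dir: str, manifest: str = "MANIFEST.sha256") -> None:
    """Write one SHA-256 line per file under raw_dir into the manifest."""
    root = Path(raw_dir)
    lines = []
    for path in sorted(root.rglob("*")):
        if path.is_file() and path.name != manifest:
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            lines.append(f"{digest}  {path.relative_to(root)}")
    (root / manifest).write_text("\n".join(lines) + "\n")

# Usage: run once right after data collection, e.g.
#   checksum_manifest("study_raw_data/")
# Re-running later and diffing the two manifests reveals any silent change.
```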

  12. Variations in data and in data-gathering
  • Experimental design data (lab)
  • Stem cells, MRI scan data
  • Mathematical data, logical analysis
  • Survey data (paper and pencil)
  • Public data (time series, economic data, population figures, official statistics)
  • Historical data (archives)
  • Anthropological field observation
  • Simulations

  13. 3. Why does it happen?
  Three main explanations:
  • Publication pressure: from whom to whom?
  • Sloppy science
  • Pressure from contract research
  Alternative tentative explanatory scheme: misplaced ambition, loose mentoring, ignoring early signals, poor peer review, no institutional response

  14. Contract research
  • What is the problem? Köbben 1995: scientific independence; pressure from above (Yes, Minister); conflicts of interest
  • Research biases? Biomedical research; Roozendaal
  • Patents, secrecy, firms' data not public
  • Remedies: "good fences make good neighbours" (R. Frost), applied to contracts
  • Research codes, guidance committees, High-Prestige Research Group (HPRG)
  • Conclusion: be an HPRG: high integrity, high skills, independent

  15. 4. What is to be done?
  • Learn from best practices across disciplines
  • Peer pressure before peer review; data manager and/or statistical counseling; open discussions to keep alert (not too often!)
  • Scientific pledge or oath-taking!?
  • Lowering publication pressure? (causality!)
  • Teaching ethics in science, integrated into data-management courses

  16. 5. Educating, learning, mentoring
  The sixpack:
  a. learning rules, discussing ethics
  b. training research skills (e.g. advanced statistics, philosophy of science)
  c. good mentoring (becoming a good scientist)
  d. oath-taking (!?)
  e. online learning, the dilemma game
  f. reading Being a Scientist
  • Select your own best combination

  17. Gift to all PhD students: [image]

  18. Thank you very much indeed for your attention
