
One or two things about Evidence-Based Management


Presentation Transcript


  1. One or two things about Evidence-Based Management Pitstop Evidence-Based HR, VOV learning network, Gent, 27 September 2012

  2. Evidence-Based Management? VOV members, n = 86

  3. Evidence-Based Practice • 1991 Medicine • 1998 Education • 1999 Social care, public policy • 2000 Nursing • 2000 Criminal justice • ???? Management?

  4. Definition Evidence-based management means making decisions about the management of employees, teams or organizations through the conscientious, explicit and judicious use of four sources of information: 1. The best available scientific evidence 2. Organizational facts, metrics and characteristics 3. Stakeholders’ values and concerns 4. Practitioner expertise and judgment

  5. Four sources

  6. Four sources

  7. Trust me: 20 years of management experience!

  8. Bounded rationality

  9. Bounded rationality

  10. The fallible brain • Seeing order in randomness • Mental corner cutting • Misinterpretation of incomplete data • Halo effect • False consensus effect • Groupthink • Self-serving bias • Sunk cost fallacy • Cognitive dissonance reduction • Confirmation bias • Authority bias • Small numbers fallacy • In-group bias • Recall bias • Anchoring bias • Inaccurate covariation detection • Distortions due to plausibility

  11. If you’re hyperventilating, breathe into a bag

  12. Elderly people who have an irregular heartbeat are much more likely to die of coronary disease, so give them a drug that reduces the number of irregular beats

  13. How 40,000 cardiologists can be wrong In the early 1980s newly introduced anti-arrhythmic drugs were found to be highly successful at suppressing arrhythmias. Not until an RCT was performed was it realized that, although these drugs suppressed arrhythmias, they actually increased mortality. The CAST trial revealed an excess mortality of 56/1000. By the time the results of this trial were published, at least 100,000 such patients had been taking these drugs.
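
A quick back-of-the-envelope check of the scale implied by those figures (the 56/1000 excess mortality and the 100,000 treated patients come from the slide; assuming, purely for illustration, that the trial's excess rate applies to the whole treated population):

```python
# Rough estimate of the harm implied by the CAST figures on the slide above.
# Assumption (illustrative): the 56/1000 excess mortality observed in the trial
# applies uniformly to all patients who received the drugs.
excess_mortality_per_1000 = 56
patients_treated = 100_000

excess_deaths = patients_treated * excess_mortality_per_1000 / 1000
print(f"Estimated excess deaths: {excess_deaths:,.0f}")  # -> 5,600
```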

  14. Errors and Biases of Human Judgment • Doctors and managers hold many erroneous beliefs, not because they are ignorant or stupid, but because these beliefs seem to be the most sensible conclusions consistent with their own professional experience and the available evidence.

  15. Problem II: false information • Half of what you learn will be shown to be either dead wrong or out of date within 7 years of your graduation; the trouble is that nobody can tell you which half. • Sackett: remember that your teachers are full of crap, just like your parents.

  16. Evidence-based? Incompetent people benefit more from feedback than highly competent people. Task conflict improves work group performance while relational conflict harms it. Being intelligent is a disadvantage for performing low-skilled jobs.

  17. Evidence-based? • Competency management • Excellent care • Quality management • The new way of working • Knowledge management • Magnet, Investors in People • Planetree, Healing Environment • Balanced Scorecard / INK • Lean / Six Sigma / TOC

  18. Don’t be a parrot! Think critically about experience, question your assumptions, and challenge what you think you know. (Show me the evidence!)

  19. 5-step approach • Formulate a focused question • Search for the best available evidence • Critically appraise the evidence • Turn the evidence into practice • Monitor the outcome

  20. Step 3: Critical appraisal of studies

  21. Best available evidence?

  22. Research designs What is the BEST car?

  23. Research designs Which design for which question?

  24. Two types of questions: effect vs non-effect

  25. Question type: effect • Does it work? • Does it work better than ...? • Does it have an effect on ...? • What are the success factors for ...? • What is needed to make it work?

  26. Question type: non-effect • Needs: What do people need, what do they want? • Attitude: What do people think or feel about ...? • Experience: What are people's experiences with ...? • Prevalence: How many people / organisations ...? • Procedure: How can we implement ...? • Explanation: Why does it work? • Economics: How much does it cost (time and money)?

  27. Which design for which question? Explanation

  28. Best research design?

  29. Best available?

  30. What is evidence? • Evidence is not the same as ‘proof’ or ‘hard facts’ • Evidence can be so strong that no one doubts its correctness, or so weak that it is hardly convincing at all

  31. The best available evidence = studies with the highest internal validity and the highest external validity

  32. Internal validity Internal validity indicates to what extent the results of a study may be biased; it is thus a comment on the degree to which alternative explanations for the observed outcome are possible (confounding).

  33. Confounding Confounding is the idea that a third variable can distort or confuse (or confound) the relationship between two other variables. For instance, when factor X causes disease Y, that relationship could be confounded by a factor C that is associated with both factor X and disease Y. C would then be an alternative explanation for the relationship observed between X and Y.

  34. What are the confounders? • Shoe size & quality of handwriting • Height & body weight • Number of storks & birth rate • Smoking youngsters & better lung function
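
A minimal simulation sketch of the first example above (the variable names and numbers are illustrative, not from the presentation): a child's age drives both shoe size and handwriting quality, so the two correlate strongly even though neither causes the other, and the correlation fades once age is held roughly constant.

```python
# Illustrative sketch of confounding: age (the confounder C) drives both
# shoe size (X) and handwriting quality (Y), producing a spurious X-Y correlation.
import random

random.seed(0)
ages = [random.uniform(5, 12) for _ in range(1000)]           # confounder C
shoe_size = [0.8 * a + random.gauss(0, 0.5) for a in ages]    # X depends on C
handwriting = [1.2 * a + random.gauss(0, 1.0) for a in ages]  # Y depends on C

def corr(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(corr(shoe_size, handwriting))  # strong positive correlation, driven by age

# "Controlling" for the confounder: within a narrow age band the correlation
# between shoe size and handwriting largely disappears.
band = [(x, y) for a, x, y in zip(ages, shoe_size, handwriting) if 8.0 <= a < 8.5]
print(corr([x for x, _ in band], [y for _, y in band]))  # close to zero
```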

  35. Reverse causation? Do charismatic leaders create successful companies, or do successful companies make their leaders look charismatic?

  36. Levels of internal validity Were there enough subjects in the study? Was a control group used? Were the subjects randomly assigned? Was a pretest used? Was the study started prior to the intervention or event? Was the outcome measured in an objective and reliable way? 6x yes = very high (A) 5x yes = high (A) 4-3x yes = limited (B) 2x yes = low (C) 1-0x yes = very low (D)
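
A small sketch of how the checklist on this slide could be turned into a scoring helper (the function name and structure are my own; the mapping from yes-counts to levels follows the slide):

```python
# Sketch of the slide's checklist: count the "yes" answers to the six
# questions and map the count to an internal-validity level (A-D).
QUESTIONS = [
    "Were there enough subjects in the study?",
    "Was a control group used?",
    "Were the subjects randomly assigned?",
    "Was a pretest used?",
    "Was the study started prior to the intervention or event?",
    "Was the outcome measured in an objective and reliable way?",
]

def internal_validity_level(answers):
    """answers: six booleans, one per question above, in order."""
    yes = sum(bool(a) for a in answers)
    if yes == 6:
        return "A (very high)"
    if yes == 5:
        return "A (high)"
    if yes in (3, 4):
        return "B (limited)"
    if yes == 2:
        return "C (low)"
    return "D (very low)"

print(internal_validity_level([True, True, True, False, True, True]))  # A (high)
```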

  37. Levels of internal validity

  38. Levels of internal validity It is shown that … It is likely that … There are signs that … Experts are of the opinion that …

  39. Best available evidence: external validity

  40. External validity: generalizability Always ask yourself to what extent the evidence is generalizable to your situation: Ecological validity: Is your organization so different from those in the study that its results may be difficult to apply? Population validity: Is your population so different from those in the study that its results may be difficult to apply?

  41. Generalizability Keep in mind: What works in one narrowly defined setting might not work in another, but some psychological principles are generalizable to all human beings.

  42. Step 4: Turning evidence into practice

  43. Feasible? • organizational facts and characteristics • cultural aspects • stakeholders’ values and concerns • political aspects • financial aspects / cost-effectiveness / ROI • priorities • change readiness / resistance to change • implementation capacity • timing

  44. Finally: Focus on the decision-making process! (not the outcome)
