
Ethical Decision Making: Heuristics and Biases William J. Wilhelm College of Business Indiana State University


Presentation Transcript


  1. Used by permission Ethical Decision Making: Heuristics and Biases William J. Wilhelm College of Business Indiana State University

  2. The Four Components of Moral Behavior (Rest et al., 1999) • Moral sensitivity • Moral judgment • Moral motivation • Moral character

  3. Steps in making a judgment • Problem recognition • Identification of alternative courses of action • Evaluation of alternative courses of action • Estimation of outcome probabilities • Calculation of expected values • Justification of course of action chosen
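The last two steps (estimating outcome probabilities and calculating expected values) are the quantitative part of this sequence. A minimal Python sketch; the alternatives, probabilities, and payoffs below are hypothetical and not taken from the slides:

```python
# Hypothetical decision with two alternative courses of action; each outcome
# is a (probability, payoff) pair. None of these figures come from the slides.
alternatives = {
    "disclose the problem now": [(0.9, -200_000), (0.1, -50_000)],
    "delay disclosure":         [(0.6,  100_000), (0.4, -1_000_000)],
}

def expected_value(outcomes):
    # Expected value = sum over outcomes of probability * payoff.
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in alternatives.items():
    print(f"{name}: expected value = {expected_value(outcomes):,.0f}")
```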

  4. BUSINESS Evaluation Tools. For example, in management decisions we use tools such as: • cost-benefit analysis • feasibility analysis • time-to-market analysis • net present value • strategic prioritization • etc.
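For one of the tools listed above, net present value, a short illustrative calculation; the discount rate and cash flows are invented for the example:

```python
def npv(rate, cash_flows):
    # Net present value: discount each period's cash flow back to today.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical project: an outlay now followed by four annual returns.
print(round(npv(0.10, [-1000, 400, 400, 400, 400]), 2))  # ~267.95, so accept
```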

  5. ETHICAL Evaluation Tools • Conventional moral rules and codes: the Golden Rule, laws, corporate codes of ethics, etc. • Universal duty towards others: Kant’s categorical imperative • Greatest good for the greatest number: Bentham & Mill’s utilitarianism • Characteristics of a good person: Aristotle’s virtue theory (bravery, honesty, temperance, generosity, justice, pride)

  6. Ethical evaluation tools (conventional rules and laws, the categorical imperative, utilitarianism, virtue theory) informing the steps in making a judgment: • Problem recognition • Identification of alternative courses of action • Evaluation of alternative courses of action • Estimation of outcome probabilities • Calculation of expected values • Justification of course of action chosen

  7. Biases and other influences on perceptions and decision making (heuristics) also bear on the steps in making a judgment: • Problem recognition • Identification of alternative courses of action • Evaluation of alternative courses of action • Estimation of outcome probabilities • Calculation of expected values • Justification of course of action chosen

  8. Rational Actors? Optimal Decision-Making Model? • People are plagued more by flawed decision making than by deliberate ethical breaches. • Cognitive and behavioral susceptibilities can lead (often unwittingly) to unethical decision making. • There is overwhelming evidence that people do not always make decisions in a rationally optimal manner (Kahneman & Tversky, 2000). • Various heuristics and biases lead most people to diverge systematically from optimal decision making.

  9. Conflicting values • Individual • Social • Religious • Organizational • Cultural • Other

  10. Biases and heuristics that can cloud ethical decision making • Obedience to authority • Social proof • False consensus effect • Over-optimism • Overconfidence • Self-serving bias • Framing • Process • Cognitive dissonance • Sunk costs • The tangible and the abstract • Time-delay traps • Loss aversion From: Prentice, R. (2004). Teaching ethics, heuristics, and biases. Journal of Business Ethics Education, 1(1), 57–74.

  11. Obedience to Authority • "Just following orders" ("Good Nazi" defense) • Stanley Milgram (1963) experiments. • Students need to be aware of this potentially corrosive influence from both formal lines of authority and non-formal authority.

  12. Social Proof • "Everyone else is doing it" • Pressure to conform with others in the group of co-workers and/or friends. • Many behaviors are driven by external influences rather than by people's own dispositions. • Obscenely high executive salaries? • Options backdating • Insider trading

  13. False Consensus Effect • Thinking that other people think the same way that we do. • Reinforces inclinations to follow authority and submit to peer pressure. • Honest people will tend to believe that those they interact with are honest as well. • Employees may get involved in some wrongdoing themselves but may not fully recognize the ethical implications of their acts.

  14. Over-optimism • Humans are often overly optimistic about OUTCOMES. • This often leads to irrational beliefs. • Although the divorce rate is around 50%, newlyweds tend to rate their own chances of divorce at 0%. • A basis for unethical decisions: corporate disclosure fraud cases could result from irrationally optimistic views of a firm’s condition and prospects.

  15. Overconfidence • People are often irrationally overconfident • Deals with perceptions about INDIVIDUAL CAPACITIES. • People tend to rate themselves as well above average in most traits, including honesty. • Business people tend to believe that they are more ethical than their competitors. • Overconfidence in one's own ethical compass can lead people to accept their own decisions without serious reflection.

  16. Self-Serving Bias • The belief that one deserves rewards for oneself. • Unconsciously affects the information people seek out, favoring evidence that confirms rather than disconfirms their position. • Affects how people remember information. • Affects judgments of fairness.

  17. Self-Serving Bias (cont’d) • Confirmation bias – searching for information that supports a conclusion and ignoring information that disconfirms it. • Belief persistence – people tend to persist in beliefs they hold long after the basis for those beliefs is substantially discredited. • Causal attribution theory – people tend to attribute to themselves more than average credit for their company’s successes (and less for its failures).

  18. Framing • People's risk preferences change with context - depending on whether an option is framed in terms of potential loss or potential gain. • The self-serving bias may lead an actor to frame decisions in such a way as to lead to ethically questionable conclusions. • Example: Maximizing (shareholder) value versus stakeholder interests
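A small illustration of the gain/loss framing point above; the scenario and numbers are hypothetical, not taken from the slides:

```python
# The same objective outcome described in a gain frame and a loss frame.
total_jobs, jobs_kept = 600, 200

gain_frame = f"The plan keeps {jobs_kept} of {total_jobs} jobs."
loss_frame = f"The plan eliminates {total_jobs - jobs_kept} of {total_jobs} jobs."

# Identical facts, but the two descriptions tend to elicit different
# risk preferences from the people who read them.
print(gain_frame)
print(loss_frame)
```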

  19. Process • People sometimes make very different decisions depending upon whether they face one big decision or a series of incremental decisions leading to the same point. • Sliding down a slippery slope happens incrementally. • Example: looking the other way during another’s errant behavior, then covering up for another, then participating, then conspiring.

  20. Cognitive Dissonance • Uncomfortable psychological inconsistency caused by incompatibility between two conflicting beliefs or attitudes • Once people have made decisions or taken positions, they will cognitively screen out or reject information which undermines their decisions or contradicts their positions.

  21. Sunk Costs • People tend to stick by decisions into which they have sunk significant costs. • Sunk costs can lead to an escalating commitment. • New product development examples • Individual job investment – job, salary, perquisites are not easily parted with.
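A brief sketch of the reasoning behind the sunk-cost point: a forward-looking comparison excludes money already spent. The figures are hypothetical:

```python
# Hypothetical product decision. The development money already spent is the
# same whether the project continues or stops, so it should not affect the choice.
sunk_development_cost = 2_000_000   # already spent
cost_to_finish        = 500_000     # still required to ship
expected_revenue      = 300_000     # forecast if the product ships

incremental_value = expected_revenue - cost_to_finish
print("continue" if incremental_value > 0 else "abandon")  # -> abandon

# Escalation of commitment is, in effect, adding the sunk cost back into
# the case for continuing.
```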

  22. The Tangible and the Abstract • Decision making is influenced more by vivid, tangible, contemporaneous factors • and less by factors that are removed in time and space. • Example: designers and marketers of new products with safety problems may underweight harms that are removed in time and space.

  23. Time-Delay Traps • When an action has both short-term and long-term consequences, the former (short-term) are much easier for people to consider. • People subject to this time-delay trap in decision-making often prefer immediate to delayed gratification.
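One common way to make the short-term versus long-term trade-off concrete is to discount delayed consequences; this sketch assumes simple exponential discounting and a 10% rate, neither of which appears in the slides:

```python
def present_value(amount, years, annual_rate=0.10):
    # Discount a delayed payoff back to its value today.
    return amount / (1 + annual_rate) ** years

# Hypothetical choice: a small payoff now versus a larger one in five years.
print(round(present_value(100, 0), 2))  # 100.0 today
print(round(present_value(150, 5), 2))  # ~93.14 today at a 10% rate
# Someone caught in the time-delay trap acts as if the discount rate were far
# higher, so the immediate option dominates even when it should not.
```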

  24. Loss Aversion • People detest losses roughly twice as much as they enjoy equivalent gains. • Endowment effect - the notion that we easily attach ourselves to things and then value them much more than we valued them before we identified with them. • People will make decisions to protect their endowment that they would never have made in the first place to accumulate it.
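A minimal sketch of the "losses hurt roughly twice as much as gains" claim, using a simple piecewise value function; the exact functional form and the coefficient of 2 are assumptions for illustration:

```python
LOSS_AVERSION = 2.0  # losses weighted roughly twice as heavily as gains

def subjective_value(change):
    # Gains are felt at face value; losses are amplified by the coefficient.
    return change if change >= 0 else LOSS_AVERSION * change

print(subjective_value(+100))  # 100
print(subjective_value(-100))  # -200: the same-sized loss feels twice as bad
```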

  25. Limitations: • Evidence shows that some of these tendencies are very difficult to debias, even with experience and training. • Nonetheless, not all attempts to debias have been failures. • Common sense dictates educating students and employees about these biases and heuristics.

  26. Why teach about heuristics and biases? • Sensitize employees to various forms of ethical dilemmas. • Educate employees about their own cognitive and behavioral susceptibilities. • Educate employees about potential non-formal organizational influences and pressures. • Inoculate employees against weaknesses in their own decision-making processes. • These topics are largely ignored in business school and law school classrooms on professional ethics.
