
Heuristics and Biases

Presentation Transcript


  1. Heuristics and Biases Tversky and Kahneman: Judgment under Uncertainty: Heuristics and Biases, Science 185, 1974

  2. How do people assess the probability of an uncertain event? • People often rely on a limited number of heuristic principles that reduce the complexity of the task. In general these heuristics are quite useful, but they can lead to severe and systematic errors. • The process resembles the subjective assessment of physical quantities, such as distance or size: judgments are based on data of limited validity. “The more sharply an object is seen, the closer it appears to be” ... leading to over- and underestimation

  3. Representativeness heuristic • Usually employed when people are asked to judge the probability that an object or event belongs to a certain class or originates from a certain process. • In answering such questions, people typically resort to the representativeness heuristic, in which probabilities are evaluated by the degree to which A is representative of B (resembles B). • When A is highly representative of B, the probability that A originates from B is judged to be high. On the other hand, if A is not similar to B, the probability that A originates from B is judged to be low.

  4. Representativeness heuristic continued • Insensitivity to prior probabilities (What is the likely occupation for Steve?) • Insensitivity to sample size (babies born in a small and a large hospital; see the sketch below) • Misconceptions of chance (gambler’s fallacy; sequences that do not look random) • Misconceptions of regression • Regression towards the mean (tall men tend to have shorter sons ... flight training: praise after an exceptionally smooth landing, punishment after a bad landing ... punishment only appears to work better, because performance regresses toward the mean either way)
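
The sample-size point can be checked with an exact binomial calculation. A minimal Python sketch, assuming illustrative daily birth counts of 15 (small hospital) and 45 (large hospital) and an equal chance of a boy or a girl:

```python
# Exact binomial probability that boys make up more than 60% of one day's
# births. The counts 15 and 45 births/day are illustrative assumptions.
from math import comb

def prob_more_than_60_percent_boys(n_births, p_boy=0.5):
    return sum(
        comb(n_births, k) * p_boy**k * (1 - p_boy) ** (n_births - k)
        for k in range(n_births + 1)
        if k / n_births > 0.6
    )

print(f"small hospital (15 births/day): {prob_more_than_60_percent_boys(15):.3f}")
print(f"large hospital (45 births/day): {prob_more_than_60_percent_boys(45):.3f}")
```

Such days are noticeably more common at the small hospital (roughly 15% of days versus roughly 7%), yet most respondents judge the two hospitals about equally likely to record them, because both samples feel equally representative of the 50/50 population.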

  5. Availability • The availability heuristic (‘availability of instances or scenarios’) is usually employed when people are asked to assess the frequency of a class or the plausibility of a particular development • The probability of an event is assessed by the ease with which instances or occurrences can be brought to mind • Examples: a mother working with disabled children, airline accidents, violent crime

  6. Availability heuristic • Biases due to the retrievability of instances • Lists of men and women (in some the women were more famous, in others the men) ... names of famous people are easier to recall, so the more famous gender is judged more numerous • Biases due to the effectiveness of a search set • Is a word more likely to start with r, or to have r as its third letter? • Biases of imaginability • Committees of size 2 and 8 • Illusory correlation • Associative connections are strengthened when events frequently co-occur

  7. Anchoring and adjustment • Adjustment from an anchor is usually employed in numerical prediction when a relevant value is available: an initial value is adjusted to yield the final answer. • The initial value may be suggested by the formulation of the problem, or it may be the result of a partial computation. In either case, adjustments are typically insufficient. • Estimating 1x2x3x4x5x6x7x8 vs 8x7x6x5x4x3x2x1 within a few seconds: both products equal 40,320, yet the descending sequence yields much higher estimates because its anchor, the partial product after the first few steps, is larger (see the sketch below) • Country population
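
A minimal sketch of the product example: both orderings equal 8! = 40,320, but the partial products reached after the first few multiplications, which presumably serve as anchors, differ by a factor of about 70 (taking the first four factors as the anchor point is an assumption for illustration).

```python
# Partial products after a few steps act as anchors; the full product is
# 40,320 in both cases. Using the first four factors as the anchor is an
# illustrative assumption.
from math import prod

ascending = [1, 2, 3, 4, 5, 6, 7, 8]
descending = ascending[::-1]

for name, seq in (("ascending", ascending), ("descending", descending)):
    anchor = prod(seq[:4])  # partial product after four multiplications
    print(f"{name:10s} anchor: {anchor:5d}   full product: {prod(seq)}")
```

Adjusting upward from either anchor (24 or 1,680) falls well short of the true value, which is why both groups underestimate and the group starting from the larger anchor gives the higher estimates.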

  8. Anchoring and adjustment • Biases in the evaluation of conjunctive and disjunctive (compound) events • Disjunctive: a complex system malfunctions if any one of its components malfunctions; people underestimate the probability of failure. P(malfunction) = 1 − 0.99^n, where 0.99 is the probability that each of the n components does not malfunction • Conjunctive: for an undertaking to succeed, each of a series of events must occur; people overestimate the probability of overall success because the probability of each individual event is high. P(success) = 0.99 × 0.99 × ... × 0.99 = 0.99^n (see the sketch below)
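
A minimal sketch evaluating the slide's two formulas for several component counts n (the particular values of n are chosen only for illustration):

```python
# Conjunctive vs. disjunctive compound events with per-component
# reliability p = 0.99; the component counts n are illustrative.
p = 0.99
for n in (1, 10, 50, 100):
    p_all_succeed = p**n                 # conjunctive: every component works
    p_some_failure = 1 - p_all_succeed   # disjunctive: at least one fails
    print(f"n = {n:3d}: P(success) = {p_all_succeed:.3f}, "
          f"P(malfunction) = {p_some_failure:.3f}")
```

Anchoring on the 0.99 per-component figure leads people to overestimate P(success) and underestimate P(malfunction); with 50 components the chance of at least one failure is already about 40%.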

  9. Anchoring and adjustment • Anchoring in the assessment of subjective probability distributions • By collecting subjective probability distributions for many different uncertain quantities, one can test a person’s calibration: the true values should fall below X1 about 1% of the time and above X99 about 1% of the time. Experimental tests show large and systematic departures from proper calibration: subjects state overly narrow confidence intervals, i.e. they think they know more than they actually do. On average they should widen their confidence intervals (lower X1 and raise X99); a simple calibration check is sketched below
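
A minimal sketch of such a calibration check, using invented interval and outcome data purely for illustration: for well-calibrated 98% intervals (X1, X99), the true value should fall outside only about 2% of the time.

```python
# Hypothetical calibration check for elicited 98% intervals (X1, X99).
# The intervals and true values below are invented for illustration only.
intervals = [(10, 40), (100, 180), (2, 5), (0.5, 1.5), (30, 35)]  # (X1, X99)
true_values = [55, 150, 7, 1.0, 50]

misses = sum(1 for (lo, hi), v in zip(intervals, true_values) if not lo <= v <= hi)
print(f"surprise rate: {misses / len(true_values):.0%} "
      f"(a well-calibrated judge would be near 2%)")
```

Overly narrow intervals show up as a surprise rate far above the nominal 2%; the remedy suggested on the slide is to widen the intervals, lowering X1 and raising X99.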

  10. Anchoring and adjustment: assessment of subjective probabilities • Recall the two ways of eliciting a subjective probability distribution: • (1)  Ask for values (for fixed probabilities) • (2)  Ask for probabilities (for fixed values) • The two procedures are formally equivalent and should yield identical distributions. However, they suggest different ways of adjustment from different anchors: • (1)  The natural starting point is one’s best estimate of the quantity, adjustment from there • (2)  The subject may be anchored on the value provided in the question, adjustment from there

  11. Conclusions • We have been concerned with cognitive biases that stem from reliance on judgmental heuristics and are not attributable to motivational effects • These heuristics are highly economical and often effective, but they lead to systematic and predictable errors • Experienced researchers, when they think intuitively, are prone to many of the same biases as laymen • Is it sufficient to be internally consistent in one’s probability statements? • Can we learn from lifelong experience? • What can we do about the biases?

  12. Dual processes Kahneman: Maps of Bounded Rationality: Psychology for Behavioral Economics, The American Economic Review, Vol. 93, No. 5 (Dec., 2003), pp. 1449-1475

  13. Dual process theory • Intuition (System 1): thoughts seem to come to mind spontaneously, without conscious search or computation, and without effort • Examples of System 1: • 2 + 2 = • Driving a car in an “easy” situation • Reasoning (System 2) is done deliberately and effortfully • Examples of System 2: • 38 x 252 = • Turning left with a car across oncoming traffic • Most of an individual’s thoughts and actions are normally intuitive • What about business decisions, or decisions in organizations?

  14. Dual process theory • System 1 / System 2 (Evans 2003, Trends in Cognitive Sciences; Kahneman 2003, The American Economic Review) • Implicit / explicit processing (Monroe & Lee 1999, JAMS) • Unconscious / conscious processing • System 1 may process information unconsciously or consciously • System 2 processes information consciously • Perception • The rules that govern intuition are generally similar to the rules of perception

  15. Consciousness • Unconscious behavior • Automatic, with a lack of awareness, intention and control • Inaccessible to the conscious mind • Unconscious processing • Our perceptions, memory, and social judgments are all constructed by our unconscious mind • From limited data, employing context, expectations and even desires • Shaped by nature to help us survive • Subliminal messages • Can have some effect on behavior

  16. Perception and System 1 • Our social perception is not a direct result of what we experience; rather, it is constructed by our minds, employing context, prior knowledge, and desires • Visual perception • The ease with which mental contents come to mind • Reference dependence • The mind fills in the missing parts of the information • Perception is also influenced by, e.g., language, touch, and motor processes • Mall survey • Soup advertisement

  17. Example of differential accessibility (figure with panels A and B)

  18. Reference dependence

  19. Our mind fills in the missing information

  20. Our mind fills in the missing information • A girl learned to read at the age of 4. What is her likely level of education? • Social factors • Environmental factors

  21. System 2 • Lazy by nature • Avoids difficult decisions (organ donations) • Information overload (jam tasting and purchases) • Limited capacity • Evaluation of cars (Science) • Reasoning may be motivated • It starts from the conclusion you want and limits the information search to information that supports the desired conclusion

  22. Cognitive systems Source: Kahneman 2003

  23. The boundaries of thinking • The judgments people express, the actions they take, and the mistakes they commit depend on • The monitoring and corrective function of System 2 • The impressions and tendencies generated by System 1 • System 1 has been associated with poor performance, but it can also be powerful and accurate • System 2 monitors judgments only lightly; on some occasions, however, it detects a potential error and an effort is made to correct it

  24. How can we improve thinking? • Thinking is impaired by: • Time pressure • Concurrent involvement in a different cognitive task • Thinking is influenced by: • Emotions • Mood • Context • Reasoning is positively correlated with: • Intelligence • Need for cognition (whether the individual enjoys thinking) • Exposure to statistical thinking
