Introduction - Presentation Transcript

  1. Key references: Barberis and Thaler (2003); Gärling et al. (2009); Li Calzi (2008); Plous (1993); Rabin (1996, and later works by the same author); Shefrin (2007); Shleifer (2000). Introduction - Bibliography

  2. For a discursive review see “Risk off”, The Economist, 25 January 2014. For an interesting essay on the history of finance in five crises, see The Economist, 12 April 2014. The crises have a common theme: in each case the state increased the subsidies and guarantees it gave to finance, and so helped set up the next crisis. The cover leader of the same issue points out that this is happening again: an American can now blindly put $250,000 in a bank, knowing his deposit is insured by the state. Finance, it argues, should be treated more like other industries. Introduction

  4. Making decisions is both tough and risky (see, for example, the reviews by Rapoport and Wallsten 1972 and Edwards and Fasolo 2001). Bad decisions can damage a business, a career, or your finances, sometimes irreparably. So where do bad decisions come from? Introduction

  5. In many cases, they can be traced back to the way the decisions were made: the alternatives were not clearly defined, the right information was not collected, the costs and benefits were not accurately weighed. Sometimes, however, the fault lies not in the decision-making process but in the mind of the decision maker: the way the human brain works can sabotage our decisions. Researchers have identified a whole series of such flaws in the way we think when making decisions. Introduction

  6. Shefrin’s (2010) insightful observation is of interest: “Finance is in the midst of a paradigm shift, from a neoclassical based framework to a psychologically based framework. Behavioural finance is the application of psychology to financial decision making and financial markets. Behaviouralising finance is the process of replacing neoclassical assumptions with behavioural counterparts. … the future of finance will combine realistic assumptions from behavioural finance and rigorous analysis from neoclassical finance.” Introduction

  7. If irrational traders cause deviations from a “true” value, rational traders will often be powerless to do anything about it (Barberis and Thaler 2003). Current examples are the oil price (which is too high) and the gold and silver prices (where there has been artificial shorting). In view of this, economists turn to the extensive experimental evidence compiled by cognitive psychologists on the systematic biases that arise when people form beliefs, and on people’s preferences. Introduction

  8. For a useful link to topics in Daniel Kahneman’s text, see Thinking, Fast and Slow | Shim Marom, which provides an overview rather than a detailed analysis. Also of interest is “Common Flaws With How We Think”, Forbes, 2/11/2014, by Ross Pomeroy. Both are intended as background reading. Introduction

  13. Extensive evidence shows that people are overconfident in their judgments. • The confidence intervals people assign to their estimates of quantities (for example, the level of the stock market in a year’s time) are far too narrow: their 98% confidence intervals include the true quantity only about 60% of the time (Alpert and Raiffa, 1982). Beliefs - Overconfidence

  14. Extensive evidence shows that people are overconfident in their judgments. 2 People are poorly calibrated when estimating probabilities: events they think are certain to occur actually occur only around 80% of the time, and events they deem impossible occur approximately 20% of the time (Fischhoff et al. 1977). Beliefs - Overconfidence
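
The calibration findings on these two slides can be checked mechanically: collect the intervals someone states at a given confidence level, then compare that stated confidence with the empirical hit rate. A minimal sketch in Python, using entirely hypothetical numbers (not data from Alpert and Raiffa):

```python
# Illustrative calibration check with invented numbers: compare the
# confidence people state with how often their intervals actually
# contain the truth.

def hit_rate(intervals, true_values):
    """Fraction of (low, high) intervals that contain the true value."""
    hits = sum(low <= t <= high for (low, high), t in zip(intervals, true_values))
    return hits / len(intervals)

# Five hypothetical "98% confidence" intervals that are far too narrow.
stated_confidence = 0.98
intervals = [(90, 110), (45, 50), (980, 1020), (7, 9), (300, 320)]
true_values = [125, 47, 1150, 8, 290]   # the truth often falls outside

print(f"stated: {stated_confidence:.0%}, actual hit rate: {hit_rate(intervals, true_values):.0%}")
# prints: stated: 98%, actual hit rate: 40%
```

A well-calibrated judge would show a hit rate close to the stated confidence; the Alpert and Raiffa result corresponds to a gap of roughly 38 percentage points.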

  15. An example from David J. Spiegelhalter and co-workers. Great Ormond Street Hospital in London (GOS) specialises in children’s diseases and acts as a regional centre for the South East of England. Whenever a blue baby is born, the paediatrician telephones GOS and a diagnosis is made; it is then decided whether or not to send the child to GOS for treatment. A Bayesian model for the diagnosis of congenital heart disease, derived from expert judgment, was used. Beliefs - Overconfidence
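
At its core, such a diagnostic model is an application of Bayes’ theorem: a prior over diagnoses is updated by the likelihood of the observed symptoms. The sketch below is a minimal naive-Bayes illustration in which the conditions, symptoms, and all probabilities are invented; the real GOS model was elicited from clinical experts (Franklin et al. 1991):

```python
# Minimal naive-Bayes diagnosis sketch. The conditions, symptoms and
# probabilities are hypothetical, for illustration only.

priors = {"condition_a": 0.6, "condition_b": 0.4}

# P(symptom observed | condition); symptoms assumed conditionally independent.
likelihoods = {
    "condition_a": {"cyanosis": 0.9, "murmur": 0.3},
    "condition_b": {"cyanosis": 0.4, "murmur": 0.8},
}

def posterior(observed_symptoms):
    """Posterior probability of each condition given the observed symptoms."""
    scores = {}
    for condition, prior in priors.items():
        p = prior
        for symptom in observed_symptoms:
            p *= likelihoods[condition][symptom]
        scores[condition] = p
    total = sum(scores.values())
    return {condition: p / total for condition, p in scores.items()}

result = posterior(["cyanosis", "murmur"])
print(result)   # condition_a is roughly 0.56, condition_b roughly 0.44
```

One design point worth noting: if experts assign probability zero to some symptom combination, the model cannot cope when that combination actually occurs, which is exactly what the “impossible” case recounted a few slides later illustrates. Practical Bayesian diagnostic systems therefore avoid hard zeros.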

  16. Franklin, R.C.G., Spiegelhalter, D.J., Macartney, F.J. and Bull, K. (1991) “Evaluation of a diagnostic algorithm for heart disease in neonates”, British Medical Journal, 302, 935-939. Beliefs - Overconfidence

  18. On the first day, a baby exhibited a mix of symptoms that the experts said would never arise together. The impossible occurred! Beliefs - Overconfidence

  19. Overconfidence may in part stem from two other biases (self-attribution and hindsight bias). 3 Self-attribution bias refers to people’s tendency to ascribe any success they have in some activity to their own talents, while blaming failure on bad luck rather than on their own ineptitude. Doing this repeatedly will lead people to the pleasing but erroneous conclusion that they are very talented. Beliefs - Overconfidence

  20. We like to exploit the luck of others. Psychologists have documented the many irrational ways we think about luck, from the fact that we prefer to make our own choices in gambling games (thus increasing our sense of control) to our belief in lucky runs or hot numbers (Wohl and Enzle, 2009). Beliefs - Overconfidence

  21. Bad luck really can be reversed by a touching-wood ritual (The Telegraph, 2 Oct. 2013). In five separate experiments, researchers had participants either tempt fate or not, and then engage in an action that was either avoidant or not. The avoidant actions included some that were superstitious, like knocking on wood, and some that were not, like throwing a ball. They found that those who knocked down (away from themselves) or threw a ball believed that a jinxed negative outcome was less likely than participants who knocked up (toward themselves) or held a ball (Zhang et al. 2013). Beliefs - Overconfidence

  22. For example, investors might become overconfident after several quarters of investing success (Gervais and Odean, 2001). In an experimental asset market where agents trade one risky asset, Maciejovsky and Kirchler (2002) find the largest overconfidence towards the end of the experiment, when the participants have gained more experience and start to rely more heavily on their (overestimated) knowledge. This finding indicates that overconfidence may be subject to modification, which points back to the crucial role of clear, rapid feedback in shaping individual overconfidence levels (Russo and Schoemaker 1992). Beliefs - Overconfidence

  23. Overconfidence may in part stem from two other biases (self-attribution and hindsight bias). 1 Hindsight bias is the tendency of people to believe, after an event has occurred, that they predicted it before it happened. 2 If people think they predicted the past better than they actually did, they may also believe that they can predict the future better than they actually can. Beliefs - Overconfidence

  24. Overconfidence is the tendency to be overly optimistic, to overestimate one’s own abilities, or to believe one’s information is more precise than it really is. In the strip, Dilbert’s boss falls victim to this bias when he assumes that all managers (presumably including himself) are better than average, all the while not recognizing Dilbert’s impolite jab at his poor math skills (cartoon: Kramer 2014). For a broad interdisciplinary review see Skala (2008). Beliefs - Overconfidence

  25. 1.25 25

  26. Beliefs 26 1.26

  27. To reduce the effects of overconfidence in making estimates, always start by considering the extremes, the low and high ends of the possible range of values. This will help you avoid being anchored by an initial estimate. Then challenge your estimates of the extremes. Try to imagine circumstances where the actual figure would fall below your low or above your high, and adjust your range accordingly. Challenge the estimates of your subordinates and advisers in a similar fashion. They're also susceptible to overconfidence (Hammond et al., 1998/2006). Overconfidence - Avoidance

  28. 1.28 28

  29. Beliefs 29 1.29

  30. Another problem takes the form of over-cautiousness, or prudence. When faced with high-stakes decisions, we tend to adjust our estimates or forecasts “just to be on the safe side.” Beliefs - Prudence

  31. Many years ago, for example, one of the Big Three U.S. automakers was deciding how many of a new-model car to produce in anticipation of its busiest sales season. The market-planning department, responsible for the decision, asked other departments to supply forecasts of key variables such as anticipated sales, dealer inventories, competitor actions, and costs. Beliefs - Prudence

  32. Knowing the purpose of the estimates, each department slanted its forecast to favour building more cars, “just to be safe.” But the market planners took the numbers at face value and then made their own “just to be safe” adjustments. Not surprisingly, the number of cars produced far exceeded demand, and the company took six months to sell off the surplus, resorting in the end to promotional pricing (Hammond et al. 1998/2006). Beliefs - Prudence

  33. Policy makers have gone so far as to codify over-cautiousness in formal decision procedures. An extreme example is the methodology of “worst-case analysis,” which was once popular in the design of weapons systems and is still used in certain engineering and regulatory settings. Using this approach, engineers designed weapons to operate under the worst possible combination of circumstances, even though the odds of those circumstances actually coming to pass were infinitesimal. Beliefs - Prudence

  34. Worst-case analysis added enormous costs with no practical benefit (in fact, it often backfired by touching off an arms race), proving that too much prudence can sometimes be as dangerous as too little. Beliefs - Prudence

  35. However, perhaps we should sometimes be more careful: the list of bridge failures includes, most famously, the Tacoma Narrows Bridge (“Galloping Gertie”, which collapsed on 7 November 1940). The Tay Bridge disaster occurred during a violent storm on 28 December 1879, when the first Tay Rail Bridge collapsed while a train was passing over it from Wormit to Dundee, killing all aboard. For William McGonagall’s poem on the subject, see The Tay Bridge Disaster. Beliefs - Prudence

  38. To avoid the prudence trap, always state your estimates honestly and explain to anyone who will be using them that they have not been adjusted. Emphasize the need for honest input to anyone who will be supplying you with estimates. Test estimates over a reasonable range to assess their impact. Take a second look at the more sensitive estimates (Hammond et al. 1998/2006). Prudence - Avoidance

  39. 1.39 39

  40. Beliefs 40 1.40

  41. Even if we are neither overly confident nor unduly prudent, we can still fall into a trap when making estimates or forecasts. Because we frequently base our predictions about future events on our memory of past events, we can be overly influenced by dramatic events, those that leave a strong impression on our memory. Beliefs - Recallability

  42. We all, for example, exaggerate the probability of rare but catastrophic occurrences such as plane crashes, because they get disproportionate attention in the media. A dramatic or traumatic event in your own life can also distort your thinking: you will assign a higher probability to traffic accidents if you have passed one on the way to work, and a higher chance that you yourself will someday die of cancer if a close friend has died of the disease. Beliefs - Recallability

  43. In fact, anything that distorts your ability to recall events in a balanced way will distort your probability assessments. In one experiment, lists of well-known men and women were read to different groups of people (Hammond et al. 1998/2006). Beliefs - Recallability

  44. Unbeknownst to the subjects, each list had an equal number of men and women, but on some lists the men were more famous than the women while on others the women were more famous. Afterward, the participants were asked to estimate the percentages of men and women on each list. Those who had heard the list with the more famous men thought there were more men on the list, while those who had heard the one with the more famous women thought there were more women. Beliefs - Recallability

  45. Corporate lawyers often get caught in the recallability trap when defending liability suits. Their decisions about whether to settle a claim or take it to court usually hinge on their assessments of the possible outcomes of a trial. Because the media tend to aggressively publicise massive damage awards (while ignoring other, far more common trial outcomes), lawyers can overestimate the probability of a large award for the plaintiff. As a result, they offer larger settlements than are actually warranted (Hammond et al. 1998/2006). Beliefs - Recallability

  46. 1.46 46

  47. Beliefs 47 1.47

  48. To minimize the distortion caused by variations in recallability, carefully examine all your assumptions to ensure they're not unduly influenced by your memory. Get actual statistics whenever possible. Try not to be guided by impressions (Hammond et al. 1998/2006). Recallability - Avoidance

  49. 1.49 49

  50. Beliefs 50 1.50