
Combining Expert Judgement: A Review for Decision Makers


Presentation Transcript


  1. Combining Expert Judgement: A Review for Decision Makers
  Simon French simon.french@mbs.ac.uk

  2. Valencia 2: Group Consensus Probability Distributions
  [Diagram: a group of experts advises either a single decision maker or a group of decision makers on issues and as-yet-undefined decisions]


  4. Different contexts → different assumptions appropriate
  • Expert Problem
    • expert judgements are data to the DM
    • OK to calibrate judgements; no assumption of equality
    • many-to-one communication
  • Group Decision Problem
    • two-step process: learn, then vote
    • learn from each other → mutual communication
    • wrong to calibrate at the decision stage?
    • equal voting power?
  • Textbook Problem
    • need to think of later, as-yet-unspecified decisions
    • need to communicate to unspecified audiences

  5. How do you question experts? If the non-swimmer averages advice on depths … he drowns! If he were to ask the question, ‘will I drown if I wade across?’ he would get a unanimous answer: yes!

  6. Approaches to the expert problem (1): Bayesian
  • Expert judgement is data
  • Difficulty lies in defining the likelihood
  p(θ | x) ∝ p(x | θ) p(θ)
  posterior probability ∝ likelihood × prior probability
  p(θ) is the DM’s prior for the quantities of interest in the real problem

  7. Approaches to the expert problem (1): Bayesian
  • Expert judgement is data, x
  • Difficulty lies in defining the likelihood
  p(θ | x) ∝ p(x | θ) p(θ)
  posterior probability ∝ likelihood × prior probability
  The likelihood is the DM’s probability for the experts’ judgements given the actual quantity of interest: correlations? elicitation errors? calibration?
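To make the Bayesian route concrete, here is a minimal sketch, not from the talk: expert point estimates are treated as independent, unbiased normal observations of the unknown quantity, and the DM's normal prior is updated conjugately. All numbers are illustrative, and the independence assumption is exactly what the slide's questions about correlations and calibration call into doubt.

    # A minimal sketch of the Bayesian "expert problem": the DM treats
    # independent expert point estimates as data and updates a normal prior.
    # All numbers (prior, expert variances) are illustrative assumptions.

    def combine_experts_bayes(m0, v0, estimates, variances):
        """Conjugate normal update: prior N(m0, v0); each expert i reports
        x_i ~ N(theta, v_i), independently. Returns posterior mean, variance."""
        precision = 1.0 / v0 + sum(1.0 / v for v in variances)
        mean = (m0 / v0 + sum(x / v for x, v in zip(estimates, variances))) / precision
        return mean, 1.0 / precision

    # DM's prior on the unknown quantity, plus three experts' estimates,
    # with variances reflecting how well calibrated the DM believes each to be.
    post_mean, post_var = combine_experts_bayes(
        m0=10.0, v0=4.0,
        estimates=[12.0, 9.5, 11.0],
        variances=[1.0, 2.0, 0.5])
    print(post_mean, post_var)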

  8. Approaches to the expert problem (2): Opinion Pools
  • Expert judgements are taken as probabilities
  • Essentially a weighted mean: arithmetic, geometric, …
  • Weights defined from the DM’s judgement, equal weights (Laplace, equal pay), or social networks
  • Cooke’s Classical method
    • weights defined from calibration data
    • are there better scoring rules?
    • many applications: a database of 45 studies
    • computationally easy
    • appears to discard poor assessors, but actually finds a spanning set
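As an illustration of the two most common pools, here is a minimal sketch (illustrative weights and distributions, not Cooke's software): a weighted arithmetic mean (linear pool) and a renormalised weighted geometric mean (logarithmic pool) over a discrete set of outcomes.

    import numpy as np

    # A minimal sketch of linear (arithmetic) and logarithmic (geometric)
    # opinion pools over a discrete event space. Weights are illustrative;
    # in Cooke's classical method they would come from calibration data.

    def linear_pool(probs, weights):
        """Weighted arithmetic mean of expert probability vectors."""
        return np.average(probs, axis=0, weights=weights)

    def log_pool(probs, weights):
        """Weighted geometric mean, renormalised to sum to one."""
        pooled = np.prod(probs ** np.asarray(weights)[:, None], axis=0)
        return pooled / pooled.sum()

    experts = np.array([[0.7, 0.2, 0.1],    # expert 1's distribution
                        [0.5, 0.3, 0.2],    # expert 2
                        [0.6, 0.3, 0.1]])   # expert 3
    w = [0.5, 0.2, 0.3]                     # weights, summing to one
    print(linear_pool(experts, w))
    print(log_pool(experts, w))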

  9. [Process diagram: Formulate issues and structure problem → Analysis → Decide and implement; expert advice on what might happen; expert input on models, parameters, probabilities]
  But all this is the easy bit …
  • cf. discussions of EDA then confirmatory statistics
  • How do you elicit models and probabilities?
  • Plausibility bias if it is the expert’s model?

  10. Group decision problem
  (p1(.), u1(.)), (p2(.), u2(.)), …, (pi(.), ui(.)), …, (pn(.), un(.)) → (pg(.), ug(.)) → ∫ ug(x) pg(x) dx
  Many approaches:
  • combine individual pi(.) and ui(.) into group pg(.) and ug(.), then form the group expected utility ranking.

  11. Group decision problem
  (p1(.), u1(.)), (p2(.), u2(.)), …, (pi(.), ui(.)), …, (pn(.), un(.))
  ∫ u1(x)p1(x)dx, ∫ u2(x)p2(x)dx, …, ∫ ui(x)pi(x)dx, …, ∫ un(x)pn(x)dx → vote
  Many approaches:
  • combine individual pi(.) and ui(.) into group pg(.) and ug(.), then form the group expected utility ranking.
  • individuals rank using their own expected utility ordering, then vote

  12. Group decision problem
  (p1(.), u1(.)), (p2(.), u2(.)), …, (pi(.), ui(.)), …, (pn(.), un(.)) → ∫ ug(x) pg(x) dx
  Many approaches:
  • combine individual pi(.) and ui(.) into group pg(.) and ug(.), then form the group expected utility ranking.
  • individuals rank using their own expected utility ordering, then vote
  • altruistic Supra Decision Maker

  13. Group decision problem
  (p1(.), u1(.)), (p2(.), u2(.)), …, (pi(.), ui(.)), …, (pn(.), un(.))
  (p1(x*), u1(x*)), (p2(x*), u2(x*)), …, (pi(x*), ui(x*)), …, (pn(x*), un(x*))
  Many approaches:
  • combine individual pi(.) and ui(.) into group pg(.) and ug(.), then form the group expected utility ranking.
  • individuals rank using their own expected utility ordering, then vote
  • altruistic Supra Decision Maker
  • negotiation models
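A small numerical sketch of the first two approaches, with made-up probabilities, utilities, and equal weights: pool the individual (pi, ui) into a group pair and rank actions by group expected utility, versus letting each member rank by their own expected utility and then vote.

    import numpy as np

    # A minimal sketch contrasting two of the approaches above, with
    # illustrative numbers: (i) pool individual probabilities and utilities
    # into group pg, ug and rank actions by group expected utility;
    # (ii) each member ranks actions by their own expected utility and a
    # plurality vote is taken. Equal weights are an assumption here.

    p = np.array([[0.6, 0.4],      # member i's probabilities for 2 states
                  [0.3, 0.7],
                  [0.5, 0.5]])
    # u[i, a, s]: member i's utility for action a in state s
    u = np.array([[[1.0, 0.0], [0.4, 0.6]],
                  [[0.9, 0.1], [0.2, 0.8]],
                  [[1.0, 0.2], [0.5, 0.5]]])

    w = np.full(3, 1/3)                          # equal weights
    p_g = w @ p                                  # linear pool of probabilities
    u_g = np.tensordot(w, u, axes=1)             # weighted mean utilities
    group_eu = u_g @ p_g                         # group expected utility per action
    print("group ranking:", np.argsort(-group_eu))

    indiv_eu = np.einsum('ias,is->ia', u, p)     # each member's own EU per action
    votes = np.bincount(indiv_eu.argmax(axis=1), minlength=2)
    print("votes per action:", votes)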

  14. Group decision problem
  Arrow’s Theorem and similar results →
  • combine individual pi(.) and ui(.) into group pg(.) and ug(.), then form the group expected utility ranking.
  • individuals rank using their own expected utility ordering, then vote
  • altruistic Supra Decision Maker
  • negotiation models
  Paradoxes and impossibility theorems abound in group decision-making theory
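The flavour of these impossibility results can be seen in the classic Condorcet paradox, sketched below with the textbook preference profile (not anything from the talk): each member's ranking is transitive, yet pairwise majority voting produces a cycle.

    from itertools import combinations

    # A minimal sketch of the kind of paradox Arrow's Theorem formalises:
    # three members with transitive individual rankings whose pairwise
    # majority preference is cyclic (the Condorcet paradox).

    rankings = [['A', 'B', 'C'],   # member 1: A > B > C
                ['B', 'C', 'A'],   # member 2: B > C > A
                ['C', 'A', 'B']]   # member 3: C > A > B

    for x, y in combinations('ABC', 2):
        x_wins = sum(r.index(x) < r.index(y) for r in rankings)
        winner, loser = (x, y) if x_wins > len(rankings) - x_wins else (y, x)
        print(f"majority prefers {winner} to {loser}")
    # Prints a cycle: A beats B, C beats A, B beats C,
    # so there is no coherent group ranking.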

  15. Group decision problem
  Arrow’s Theorem and similar results →
  • combine individual pi(.) and ui(.) into group pg(.) and ug(.), then form the group expected utility ranking.
  • individuals rank using their own expected utility ordering, then vote
  • altruistic Supra Decision Maker
  • negotiation models
  • a social process which translates individual decisions into an implemented action
  Decision conferences:
  • built around ‘reference’ decision or negotiation models
  • decision analysis is as much about communication as about supporting decision making
  • might vote, or might leave the actual decision to unspoken political/social processes

  16. Group decision support systems
  • The advent of readily available computing means that algorithmic solutions to the Group Decision Problem are attractive.
  • Few software developers know any of the theory in this area, and ignorance of Arrow is rife.

  17. The textbook problem
  • How to present results to help in future, as-yet-unspecified decisions?
  • How does one report with that in mind?
  • Public participation and the web mean that many stakeholders in issues are seeking and using expert reports … whether or not they understand them

  18. Cooke’s Principles for scientific reporting of expert judgement studies
  • Empirical control: quantitative expert assessments are subjected to empirical quality controls.
  • Neutrality: the method for combining/evaluating expert opinion should encourage experts to state their true opinions, and must not bias results.
  • Fairness: experts are not pre-judged prior to processing the results of their assessments.
  • Scrutability/accountability: all data, including experts’ names and assessments, and all processing tools are open to peer review, and results must be reproducible by competent reviewers.

  19. Cooke’s Principles for scientific reporting of expert judgement studies (as above)
  Few reports satisfy these principles: Chatham House reporting
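To illustrate what 'empirical control' can mean in practice, here is a minimal sketch in the spirit of Cooke's classical model (illustrative data, simplified scoring): an expert's 5%/50%/95% quantile assessments on seed variables with known realisations are turned into a calibration score via the likelihood-ratio statistic.

    import math
    from scipy.stats import chi2

    # A minimal sketch of "empirical control" in the spirit of Cooke's
    # classical model: score an expert's 5%/50%/95% quantile assessments
    # on seed variables whose true values are known. Data are illustrative.

    P = [0.05, 0.45, 0.45, 0.05]          # theoretical inter-quantile masses

    def calibration_score(quantiles, realisations):
        """quantiles: list of (q05, q50, q95) per seed variable."""
        counts = [0, 0, 0, 0]
        for (q05, q50, q95), x in zip(quantiles, realisations):
            bin_ = sum(x > q for q in (q05, q50, q95))   # which bin x falls in
            counts[bin_] += 1
        n = len(realisations)
        s = [c / n for c in counts]
        # relative entropy I(s; p), then the likelihood-ratio statistic 2n*I
        info = sum(sj * math.log(sj / pj) for sj, pj in zip(s, P) if sj > 0)
        return chi2.sf(2 * n * info, df=3)   # high score = well calibrated

    qs = [(1, 5, 9), (10, 20, 40), (0, 2, 6), (3, 7, 12), (5, 8, 15)]
    xs = [4, 22, 3, 8, 30]
    print(calibration_score(qs, xs))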

  20. The Textbook Problem relates to …
  • exploring issues, formulating decision problems, developing prior distributions
  • so a report should anticipate meta-analyses and give calibration data, expert biographies, background information, etc.
  • since the precise decision problem is not known at the time of the expert studies, the reports will be used to build prior distributions, not update them
  • Need meta-analytic approaches for expert judgement:
    • little peer review
    • no publication bias
    • ‘self’-promotion of reports by pressure groups
    • Cooke’s principles not even considered

  21. The textbook problem for public participation
  • The public and stakeholders will need to develop their priors from the information available
  • But they will not always be sophisticated DMs, nor will they be supported by an analyst
  • Behavioural issues:
    • probabilities versus frequencies (Gigerenzer)
    • risk communication, celebrity
    • observables versus parametric constructs
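Gigerenzer's point about probabilities versus natural frequencies is easy to illustrate; the numbers below are hypothetical: stating a screening problem as counts in a population of 10,000 makes the posterior probability transparent, where the conditional-probability format often misleads.

    # A minimal sketch of Gigerenzer's point about probabilities versus
    # natural frequencies, using illustrative numbers for a screening test:
    # 1% prevalence, 90% sensitivity, 9% false-positive rate.

    population = 10_000
    sick = population * 0.01                 # 100 people have the condition
    true_pos = sick * 0.90                   # 90 of them test positive
    false_pos = (population - sick) * 0.09   # 891 healthy people test positive
    ppv = true_pos / (true_pos + false_pos)  # P(sick | positive)
    print(f"{true_pos:.0f} of {true_pos + false_pos:.0f} positives are sick "
          f"(PPV = {ppv:.0%})")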

  22. Questions?
