
The Doxastic engineering of acceptance rules


Presentation Transcript


  1. The Doxastic engineering of acceptance rules { Kevin T. Kelly, Hanti Lin } Carnegie Mellon University. This work was supported by a generous grant from the Templeton Foundation.

  2. Two Models of Belief: Propositions. [Diagram: three mutually exclusive propositions A, B, C]

  3. Two Models of Belief: Propositions and Probabilities. [Diagram: the probability simplex over A, B, C, with vertices (1, 0, 0), (0, 1, 0), (0, 0, 1) and center (1/3, 1/3, 1/3)]

  4. How Do They Relate? [Diagram: the probability simplex and the propositions A, B, C, linked by an unknown map '?']

  5. Acceptance. [Diagram: the same picture, with the map now labeled Acpt, taking probabilities to propositions]

  6. Acceptance as Inference • You condition on whatever you accept (Kyburg, Levi, etc.) • Very serious business! • Does it ever happen? • "It's not Sunday, so let's buy beer at the supermarket." • You would never bet your life against nothing that what you say to yourself in routine planning is true.
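The slide's first bullet can be made concrete with a small sketch. The Python snippet below is illustrative only (the atoms, numbers, and function name are my assumptions, not the speakers'): it conditions a credence over three mutually exclusive atoms A, B, C on an accepted proposition.

    # Illustrative sketch: Bayesian conditioning on an accepted proposition.
    # Atoms A, B, C are mutually exclusive and exhaustive; a credence is a dict
    # of atom probabilities; a proposition is a set of atoms.

    def condition(credence, accepted):
        """Zero out atoms outside the accepted proposition and renormalize."""
        total = sum(p for atom, p in credence.items() if atom in accepted)
        return {atom: (p / total if atom in accepted else 0.0)
                for atom, p in credence.items()}

    credence = {"A": 1/3, "B": 1/3, "C": 1/3}  # uniform credence
    accepted = {"A", "B"}                      # accept "A or B" (i.e., not C)
    print(condition(credence, accepted))       # {'A': 0.5, 'B': 0.5, 'C': 0.0}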

  7. Acceptance as Apt Representation • The geometry of probabilities is much richer than the lattice of propositions. • Aim: represent Bayesian credences as aptly as possible with propositions.

  8. Acceptance as Apt Representation • The geometry of probabilities is much richer than the lattice of propositions. • Aim: represent Bayesian credences as aptly as possible with propositions. [Diagram: credences near the B corner of the simplex mapped by Acpt to B]

  9. Acceptance as Apt Representation • The geometry of probabilities is much richer than the lattice of propositions. • Aim: represent Bayesian credences as aptly as possible with propositions. [Diagram: credences between the B and C corners of the simplex mapped by Acpt to B v C]

  10. Familiar Puzzle for Acceptance • Suppose you accept propositions more probable than 1/2. • Consider a 3-ticket lottery. • For each ticket, you accept that it loses. • That entails that every ticket loses. (Kyburg) [Diagram: the 1/2-threshold regions for -A, -B, -C on the simplex]
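A quick arithmetic check of the puzzle (my own illustration; the fair 3-ticket lottery and the code below are assumptions, not from the slides): each proposition "ticket i loses" has probability 2/3 and so clears the 1/2 threshold, yet the conjunction of the three is the impossible proposition that every ticket loses.

    # Illustrative sketch of the lottery paradox for a fair 3-ticket lottery.
    # Exactly one ticket wins; a proposition is the set of tickets whose win
    # would make it true.
    from fractions import Fraction

    atoms = {1: Fraction(1, 3), 2: Fraction(1, 3), 3: Fraction(1, 3)}

    def prob(proposition):
        return sum((p for t, p in atoms.items() if t in proposition), Fraction(0))

    loses = {i: set(atoms) - {i} for i in atoms}       # "ticket i loses"
    for i, prop in loses.items():
        print(f"p(ticket {i} loses) = {prob(prop)}")   # 2/3 > 1/2, so each is accepted

    all_lose = set.intersection(*loses.values())       # conjunction of the three
    print(f"p(every ticket loses) = {prob(all_lose)}") # 0: the accepted set is inconsistent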

  11. Thresholds as Logic • High probability is like truth value 1.

  12. Thresholds as Geometry. [Diagram: the probability simplex with vertices (1, 0, 0), (0, 1, 0), (0, 0, 1) and center (1/3, 1/3, 1/3)]

  13. Thresholds as Geometry. [Diagram: the threshold line p(A) = 0.8 on the simplex over A, B, C]

  14. Thresholds as Geometry. [Diagram: the threshold line p(A v B) = 0.8 on the simplex]

  15. Thresholds as Geometry. [Diagram: threshold zones for A, A v B, and A v C on the simplex]

  16. Thresholds as Geometry. Closure under conjunction. [Diagram: the zones for A, A v B, and A v C, illustrating closure under conjunction]

  17. Thresholds as Geometry. [Diagram: threshold zones for A, B, C and the disjunctions A v B, A v C, B v C]

  18. Thresholds as Geometry. [Diagram: the full acceptance map, with zones for A, B, C, the disjunctions, and the tautology T in the center]
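Before the paradox slides, it may help to see the bare threshold rule written out. The sketch below is mine (the rule "accept X iff p(X) >= t", the credal point, and the threshold are illustrative assumptions): it lists the propositions accepted at a credal point and checks whether they are closed under conjunction, which is exactly what the next slides show can fail.

    # Illustrative threshold rule over atoms A, B, C: accept X iff p(X) >= t.
    # Propositions are frozensets of atoms; the contradiction is the empty set.
    from fractions import Fraction as F
    from itertools import combinations

    ATOMS = ("A", "B", "C")

    def propositions():
        """All non-empty subsets of the atoms."""
        return [frozenset(s) for r in range(1, len(ATOMS) + 1)
                for s in combinations(ATOMS, r)]

    def accepted(credence, t):
        """Propositions whose probability meets the threshold."""
        return [X for X in propositions() if sum(credence[a] for a in X) >= t]

    def closed_under_conjunction(acc):
        """Is the intersection of any two accepted propositions itself
        accepted (or the contradiction)?"""
        acc = set(acc)
        return all((X & Y) in acc or not (X & Y) for X in acc for Y in acc)

    credence = {"A": F(7, 10), "B": F(2, 10), "C": F(1, 10)}  # an example credal point
    acc = accepted(credence, t=F(8, 10))
    print([sorted(X) for X in acc])       # [['A', 'B'], ['A', 'C'], ['A', 'B', 'C']]
    print(closed_under_conjunction(acc))  # False: A v B and A v C are accepted, but not A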

  19. The Lottery Paradox. [Diagram: the threshold zones at t = 2/3 for A, B, C, the disjunctions A v B, A v C, B v C, and T]

  20. The Lottery Paradox. [Diagram: the same threshold zones at t = 2/3, continued]

  21. The Lottery Paradox. [Diagram: the threshold zones at t = 2/3, continued]

  22. Two Solutions: CMU vs. LMU. [Diagram: two acceptance maps on the simplex, each with zones for A, B, C, A v B, A v C, B v C, and T] (Levi 1996)

  23. Two Solutions: CMU vs. LMU. [Diagram: the two acceptance maps compared, continued] (Levi 1996)

  24. Two Solutions: CMU vs. LMU. [Diagram: the two acceptance maps compared, continued] (Levi 1996)

  25. Why Cars Look Different. "That's junk! I want a smoother ride! I want tighter handling!" [Cartoon: designer and consumer]

  26. Why Cars Look Different. "Grow up. We can optimize one or the other but not both." [Cartoon: designer and consumer]

  27. Why Cars Look Different. "Grow up. We can optimize one or the other but not both." [Cartoon: designer and consumer, now labeled LMU = steady, CMU = responsive]

  28. Why Cars Look Different • LMU steady = Change what you accept only when it is logically refuted. • CMU responsive = Track probabilistic conditioning exactly.
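To make the contrast on slide 28 concrete, here is a toy sketch of my own devising; it is not the CMU or LMU rule from the talk, and the acceptance function, atoms, and numbers are all illustrative assumptions. A "steady" update keeps the previously accepted proposition until evidence logically refutes it; a "responsive" update simply reapplies the acceptance map to the conditioned credence (a crude gloss of tracking conditioning).

    # Toy contrast of "steady" vs. "responsive" updating (illustrative only;
    # not the CMU or LMU rules presented in the talk).

    def condition(credence, evidence):
        """Bayesian conditioning on an evidence proposition (a set of atoms)."""
        total = sum(p for a, p in credence.items() if a in evidence)
        return {a: (p / total if a in evidence else 0.0) for a, p in credence.items()}

    def accept(credence, t=0.75):
        """Illustrative rule: accept the most probable atoms until mass t is reached."""
        accepted, mass = set(), 0.0
        for a, p in sorted(credence.items(), key=lambda kv: -kv[1]):
            if mass >= t:
                break
            accepted.add(a)
            mass += p
        return accepted

    def steady_update(old, credence, evidence):
        """Change what you accept only when the evidence logically refutes it."""
        return old if old & evidence else accept(condition(credence, evidence))

    def responsive_update(credence, evidence):
        """Recompute acceptance from the conditioned credence."""
        return accept(condition(credence, evidence))

    prior = {"A": 0.5, "B": 0.3, "C": 0.2}
    k0 = accept(prior)                           # accepts A v B
    e = {"A", "C"}                               # learn "not B"
    print(sorted(steady_update(k0, prior, e)))   # ['A', 'B']: not refuted, so unchanged
    print(sorted(responsive_update(prior, e)))   # ['A', 'C']: recomputed from p(. | A or C)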

  29. End of Roundtable Segment 1. [Diagram: LMU paired with steadiness, CMU with responsiveness]
