
Lecture 4: Embedded Conditionals, Uncertainty and Indeterminacy


Presentation Transcript


  1. Lecture 4: Embedded Conditionals, Uncertainty and Indeterminacy Dorothy Edgington Paris 2019

  2. Principal virtue • of a suppositional theory: it gives a good account of the fact that our conditional judgements are often uncertain. You can be closer to or further from certain that Jane will accept if offered the job, that your back pain will be cured if you have the operation, etc. All propositional theories of conditionals give bad results for uncertain conditional judgements.

  3. Principal vice • No account of how to assess embedded conditionals, either in terms of truth values or in terms of probabilities • ‘It’s more probable than not that if the gardener did it he used a spade and if the cook did it he used a knife.’ No theory, either of truth conditions or probability, for this. • We don’t even have a theory of negation of conditionals.

  4. Negation • One can propose to read ‘It’s not the case that if A, B’ as ‘If A, then ¬B’; others have suggested that it is rather ‘If A, then it might be that ¬B’. I think the latter is wrong and the former is OK. Consider: • Will Jane accept if she is offered the job? • I think so; but she might not (accept if she’s offered the job).

  5. 1st semantic proposal • De Finetti 1935, reinvented by Belnap 1970, et al. See Milne 1997; McDermott 1998; and more. • A→B is true if A&B, false if A&¬B, undefined if ¬A.
      A  B | A→B
      T  T |  T
      T  F |  F
      F  T |  U
      F  F |  U

  6. A B AB ¬A A&B AvB T TT F T T T F F T F T T U U F U T F T U T F T F F U T F F F U U T F U U T U UU T U F U U F U U UUUUU

  7. De Finetti continued. The probability of a conditional is not the probability that it's true (it is true iff A&B) but the probability that it's true given that it's defined, i.e. is either true or false, i.e. the probability of B given A. It is no fault in a conditional that it is not true, for it's no fault in a conditional that it has a false antecedent. I say 'If you press that button there will be an explosion'. A disaster is avoided because, fortunately, my remark is not true (you don't press it). One might worry that the normative dimension of truth is lost.
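A tiny worked illustration of the rule (the numbers are arbitrary and my own, not the lecture's): the probability of A→B is the probability of truth given definedness, which is just p(B|A).

```python
# Arbitrary illustrative numbers: p(A&B), p(A&¬B), p(¬A).
pAB, pAnotB, pnotA = 0.3, 0.1, 0.6

p_true      = pAB          # A->B is true only on A&B
p_false     = pAnotB       # false only on A&¬B
p_undefined = pnotA        # undefined on ¬A

# De Finetti's rule: probability of truth given that the conditional is defined.
print(p_true / (p_true + p_false))   # 0.75, which is p(B|A) = p(A&B)/p(A)
```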

  8. De Finetti continued: more worries • Even a necessary conditional like 'If A&B then A' can fail to be true. • Validity can't be necessary preservation of truth, for if it were, 'If A, B; so A&B' would be a valid argument. There are more decisive objections when we consider some embedded conditionals on this theory:

  9. Problem 1 • (1) ((A→B)&(¬A→C)) • E.g. mother says ‘If it rains tomorrow we’ll go to the cinema, and if it doesn’t rain we’ll go to the beach’. ((R→C)&(¬R→B)). You’re pretty confident of this. • (1) can’t be true; and it might be false in the unlucky event that R&¬C, or ¬R&¬B. • So the probability that it’s true, given that it has a truth value, is 0. That’s crazy.
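The claim that (1) can never be true can be checked mechanically with the three-valued tables above; the following sketch (mine, not the lecture's, and assuming the strong-Kleene conjunction of slide 6) enumerates the classical possibilities for A, B and C:

```python
# Check of slide 9: with the tables above, (A->B)&(¬A->C) is never true,
# though it is false when A&¬B or when ¬A&¬C.
from itertools import product

T, F, U = 'T', 'F', 'U'
def neg(a):     return {T: F, F: T}[a]
def conj(a, b): return F if F in (a, b) else (U if U in (a, b) else T)
def cond(a, b): return b if a == T else U

for A, B, C in product((T, F), repeat=3):        # the facts themselves are two-valued
    value = conj(cond(A, B), cond(neg(A), C))
    print(A, B, C, '->', value)                  # the value is only ever F or U, never T
```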

  10. Problem 2. Valid on this proposal: • (A&B)→C; therefore (A→C) v (B→C) • ‘If it’s a triangle and it’s equi-angular, it’s equilateral; therefore, either, if it’s a triangle it’s equilateral, or, if it’s equi-angular, it’s equilateral.’
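To see why this validity is bad news, here is a toy distribution (my own made-up numbers, not the lecture's), reading each conditional suppositionally as a conditional probability: the premise comes out certain while each disjunct is very improbable.

```python
# A = it's a triangle, B = it's equiangular, C = it's equilateral.
# Made-up distribution over which shape has been drawn:
p = {
    ('A', '¬B', '¬C'): 0.495,   # scalene triangle
    ('¬A', 'B', '¬C'): 0.495,   # non-square rectangle: equiangular, not equilateral
    ('A', 'B', 'C'):   0.010,   # equilateral triangle
}

def prob(*props):
    return sum(w for outcome, w in p.items() if all(x in outcome for x in props))

print(prob('C', 'A', 'B') / prob('A', 'B'))   # p(C|A&B) = 1.0: premise certain
print(prob('C', 'A') / prob('A'))             # p(C|A) ≈ 0.02: 'if triangle, equilateral' improbable
print(prob('C', 'B') / prob('B'))             # p(C|B) ≈ 0.02: 'if equiangular, equilateral' improbable
```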

  11. 2nd semantic proposal • van Fraassen (1975), McGee (1989), Jeffrey (1991), Jeffrey and Stalnaker (1994). We should be more selective about the third value. Let the semantic value of A→B be 1 (= true) if A&B; 0 (= false) if A&¬B; and let its semantic value, if ¬A, be p(B|A). The ‘probability’ of A→B is not the probability of its truth, but its ‘expected value’.

  12. 2nd proposal
      A  B | A→B
      T  T |  1
      T  F |  0
      F    |  p(B|A)

  13. Example • 50% of the balls are red and 80% of the red balls have a black spot. • ‘If I pick a red ball (R) it will have a black spot (B)’ gets 1 if R&B, 0 if R&¬B. What if ¬R? Who knows?—but 80% of the R-possibilities are B-possibilities, so it is (as it were) “80% true” if ¬R. • p(R→B) = p(R&B)×1 + p(R&¬B)×0 + p(¬R)×0.8 • = (0.4×1) + (0.1×0) + (0.5×0.8) = 0.8 = p(B|R)
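The slide's arithmetic, written out step by step with the numbers given on the slide:

```python
pR, pB_given_R = 0.5, 0.8            # 50% red; 80% of red balls have a black spot

pRB    = pR * pB_given_R             # p(R&B)  = 0.4
pRnotB = pR * (1 - pB_given_R)       # p(R&¬B) = 0.1
pnotR  = 1 - pR                      # p(¬R)   = 0.5

# Expected value of R->B under the 2nd proposal: 1 on R&B, 0 on R&¬B, p(B|R) on ¬R.
print(pRB * 1 + pRnotB * 0 + pnotR * pB_given_R)   # 0.8 = p(B|R)
```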

  14. 2nd proposal continued. Many of the bad results of the first proposal are avoided. Necessary conditionals like A&B→A always come out true. The switches paradox is invalid. Conjunctions like (A→B)&(¬A→C) don’t all get probability 0. Indeed it is provable that conjunctions of that form always get probability p(B|A)·p(C|¬A). This, however, is a source of problems.

  15. (AB)&(¬AC) • On this proposal, when the antecedents are incompatible, e.g. A and ¬A, the probability of the conjunction is the product of the probabilities of the conditionals. • Sketch of proof: there are 2 ways the conjunction can get a non-zero value: (1) A&B, in which case the first conjunct gets 1 and the second gets p(C¬A) so the conjunction gets p(C¬A); (2) ¬A&C, so the conjunction gets p(BA). So the expected value is p(A&B).p(C¬A) + p(¬A&C).p(BA) = P(A).p(B/A)p(C/¬A) + p(¬A)pC/¬A).p(B/A) = p(B/A)p(C/¬A)

  16. When A and C are incompatible: p((A→B)&(C→D)) = p(A→B)·p(C→D). Simplest case of this: C = ¬A and B = D, so p((A→B)&(¬A→B)) = p(A→B)·p(¬A→B). Counterexample: let A be irrelevant to B, so p(B) = p(A→B) = p(¬A→B). You’re worried about missing our plane connection. I think we’ll make it. You suggest crossing your fingers. I say, ‘It’s likely that we’ll make it, whether or not you cross your fingers: if you cross your fingers, we’ll make it, and if you don’t cross your fingers we’ll make it’. Let’s say my p(B) = 0.7. The conjunction should get 0.7, not 0.49.
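A numeric check of the fingers-crossed case (my own arithmetic, following the sketch on slide 15 with C = B): with A irrelevant to B and p(B) = 0.7, the 2nd proposal's expected-value rule gives the conjunction about 0.49 rather than the intuitive 0.7. The value p(A) = 0.5 is an extra assumption of mine; the result does not depend on it.

```python
pA, pB = 0.5, 0.7
pB_given_A = pB_given_notA = pB     # irrelevance of A to B

# Value of (A->B)&(¬A->B) in each cell, as in the proof sketch on slide 15:
value = {
    'A&B':   1 * pB_given_notA,     # first conjunct 1, second gets p(B|¬A)
    'A&¬B':  0,                     # first conjunct 0
    '¬A&B':  pB_given_A * 1,        # second conjunct 1, first gets p(B|A)
    '¬A&¬B': 0,                     # second conjunct 0
}
weight = {'A&B': pA*pB, 'A&¬B': pA*(1-pB), '¬A&B': (1-pA)*pB, '¬A&¬B': (1-pA)*(1-pB)}

print(sum(weight[c] * value[c] for c in value))   # ≈ 0.49 = p(B|A)·p(B|¬A), not 0.7
```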

  17. Another, due to Mark Lance • Concerns a werewolf, such that it’s 50% likely that it is in our area tonight. If it is, it will kill everyone outside. ‘If John went out he was killed’ gets 0.5. ‘If John went out the front door he was killed, and if John went out the back door he was killed’ gets only 0.25, whereas it should get 0.5.

  18. Another counterexample • I must pick one of two urns, only one of which contains a prize. It’s 50-50 which urn contains the prize. ‘If I pick the one on the right, I’ll win’ gets 0.5. ‘If I pick the one on the left, I’ll win’ gets 0.5. ‘If I pick the one on the right I’ll win and if I pick the one on the left I’ll win’, on this proposal gets 0.25, but surely it should get 0.

  19. What has gone wrong • with this proposal is that it gives the conditional with the false antecedent the same value in all worlds in which the antecedent is false. A world in which I pick Right and win is a world in which ‘If I had picked Left I’d have won’ gets 0, not 0.5. A world in which you cross your fingers and we make our connection is a world in which ‘If you had not crossed your fingers we would have made our connection’ gets 1, not 0.7.

  20. Principled objections • In any case, this proposal deals with a very strange 3-valued entity: its value is a truth value (which we call 1 or 0) in the case where the antecedent is true, and one’s subjective probability for B given A in the case where the antecedent is false; the ‘probability’ of the conditional is a weighted average of these. Also, the semantic value is unstable, changing if you change your beliefs and differing between people with different beliefs. As before, the probability of the conditional is not the probability that it is true.

  21. 3rd proposal: Richard Bradley (2012) • We shall return to Stalnaker: a conditional ‘if A, B’ is true iff B is true in the “nearest” A-world (which may or may not be the actual world). We amend Stalnaker in three ways: (1) we abandon the notion of similarity in favour of a probability distribution over the candidate A-worlds. • (So “nearest” does not mean “most similar”. It really means “the world that would be actual if A were true”. Though, as we shall see, it can be indeterminate which world that is.)

  22. 2nd and 3rd amendments • (2) The conditional is not treated as a proposition. It obeys different rules from propositions. • (3) We take seriously the fact that it might be indeterminate which world would have been actual if A were true. (Stalnaker himself says the conditional is neither true nor false if this is so. We say it is still perfectly in order to have a probability distribution over the candidate A-worlds.)

  23. Why similarity is the wrong notion • A straw is to be selected. [Figure: three straws. Top: 10 cm, 90% likely; next: 11 cm, 1% likely; bottom: 20 cm, 9% likely.] ‘If it’s over 10 cms. long, it’s less than 15 cms. long.’
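On the natural reading of the missing picture (the chances and lengths listed above pair off top to bottom), the arithmetic behind the example is: p(over 10 cm) = 1% + 9% = 10%, and p(over 10 cm & under 15 cm) = 1%, so the conditional should get p(under 15 | over 10) = 0.01/0.10 = 0.1. But the most similar world in which the straw is over 10 cm is plausibly one in which it is 11 cm, so a similarity-based account counts the conditional as true, and hence as highly probable, which is the wrong result.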

  24. Amendment 2 • Conditionals are not propositions, i.e. they are not categorical statements about how things are. Conditionals involve two propositions which play different roles, one a supposition, one a judgement within its scope. They cannot be represented by the set of worlds in which they are true. Indeed conditionals are not ‘in’ worlds—they are cross-world entities. They obey different rules from propositions.

  25. Amendment 2 continued • Bradley proposes that the conditional A→B can be represented by the set of pairs of worlds <wi, wj> such that, if wi is the actual world and wj is the world that would be actual if A were true, then B is true (at wj).

  26. Two types of uncertainty • are involved in assessing a conditional A→B: uncertainty about the facts—about which world is actual; and uncertainty about what would be the case if some supposition were true. We have a probability distribution over the facts; and we have a probability distribution over the candidate A-worlds. These combine into a joint probability distribution over the ordered pairs.

  27. AB • There are 3 possible worlds; but four possibilities. • w1 A,B <w1,w1> T • w2 A,¬B <w2,w2> F • w3 ¬A <w3,w1> T • <w3,w2> F

  28. Two rules governing this entity • First, centering: if A is true, the ‘nearest’ A-world is the actual world. The conditional is true/false depending on whether B is actually true/false. • Second: the probability of A→B given A is the same as the probability of A→B given ¬A; the probability of the conditional is independent of its antecedent.

  29. Comments • (1) This is enough to guarantee that p(A→B) = p(B|A) • (2) This is a weaker independence condition than was built into the last proposal, and is immune from the counterexamples • (3) No treatment of the conditional as an ordinary proposition has this result
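The calculation behind comment (1) is short. By centering, given A the conditional is true iff B is, so p(A→B | A) = p(B|A). Then, by total probability, p(A→B) = p(A→B | A)·p(A) + p(A→B | ¬A)·p(¬A) = p(A→B | A)·(p(A) + p(¬A)) = p(B|A), where the middle step uses the independence condition p(A→B | ¬A) = p(A→B | A).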

  30. Old picture • There are 4 possible worlds:
                        A→B
      w1  A, B    <w1, w1>  T
      w2  A, ¬B   <w2, w2>  F
      w3  ¬A      <w3, w1>  T
      w4  ¬A      <w3, w2>  F

  31. (AB)&(¬AB) • If you cross your fingers we’ll make our connection and if you don’t cross your fingers we’ll make our connection. • 4 possible worlds • We need ordered triples <wi, wj, wk> such that wi is actual and wj is the nearest A-world and wk is the nearest ¬A-world. • One possibility is the actual world is A, B. Then we give 1 to the nearest ¬A-world being B, and 0 to the nearest ¬A-world being ¬B. That’s enough to show how it differs from proposal 2 which gives ¬AB the same value, p(BA) in all worlds in which A is false.

  32. Suppose p(BA) = p(B¬A) = p(B) = 0.8, and let p(A) = 0.5 What about p((AB)&(¬AB)) (call it C)? (NB this slide is here merely to show how to display the truth conditions of the conjunction.) To assess this, we need ordered triples, <wi, wj, wk> which represent the possibility that wi is actual, wj is the nearest A-world and wk is the nearest ¬A-world. Given centering, we have the following possibilities: AB ¬AB C w1 (0.4) A&B <w1, w1, w3> T T T <w1, w1, w4> T F F w2 (0.1) A&¬B <w2, w2, w3> F T F <w2, w2, w4> F F F w3 (0.4) ¬A&B <w3, w1, w3> T T T <w3, w2, w3> F T F w4 (0.1) ¬A&¬B <w4, w1, w4> T F F <w4, w2, w4> F F F
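The truth-value columns of this display can be recomputed mechanically. A small sketch (mine, not the lecture's) that just encodes the rule that A→B is true iff the nearest A-world wj is a B-world, and ¬A→B is true iff the nearest ¬A-world wk is a B-world:

```python
# w1 = A&B, w2 = A&¬B, w3 = ¬A&B, w4 = ¬A&¬B; the B-worlds are w1 and w3.
B_worlds = {'w1', 'w3'}
triples = [
    ('w1', 'w1', 'w3'), ('w1', 'w1', 'w4'),
    ('w2', 'w2', 'w3'), ('w2', 'w2', 'w4'),
    ('w3', 'w1', 'w3'), ('w3', 'w2', 'w3'),
    ('w4', 'w1', 'w4'), ('w4', 'w2', 'w4'),
]
for wi, wj, wk in triples:
    AB    = wj in B_worlds          # A->B true iff nearest A-world is a B-world
    notAB = wk in B_worlds          # ¬A->B true iff nearest ¬A-world is a B-world
    C     = AB and notAB            # the conjunction
    print(wi, wj, wk,
          'T' if AB else 'F', 'T' if notAB else 'F', 'T' if C else 'F')
```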

  33. 3rd amendment: indeterminacy • If I pick a red ball it will have a black spot. (90%) • If I had picked a red ball, it would have had a black spot. • If you approach, the dog will bite you. (very likely) • If I had approached, the dog would have bitten me. • If you have the operation you will be cured. (90%) • If you had had the operation, you would have been cured.

  34. Reasons for no determinate truth value • 1. Indeterminism • 2. Determinism but still no saying exactly what my hand movements would have been • 3. Determinism but vocabulary of conditional not suited to subsumption under deterministic laws. • 4. Determinism but antecedent highly general: ‘If I had had another child it would have been a girl’

  35. Fits well with my views on another kind of indeterminacy: vagueness • I give ‘degree of closeness to clear truth’ probabilistic structure. • (One motivation: analogy between the lottery paradox and the sorites paradox.) • The probabilistic aspect of this counterfactual indeterminacy lends added support to giving degrees of closeness to clear truth probabilistic structure. It’s obviously probabilistic in the counterfactual case. There can be probabilities without outcomes.

  36. The approach to truth conditions that I and many others share about vagueness supports Bradley’s approach to truth conditions for conditionals even in the presence of indeterminacy.

  37. Truth conditions for a borderline case: • Red or orange—but not determinate which. • Red or not red—but not D which. • True or false that it’s red—but not D which. • You give truth conditions in the usual way for e.g. ‘Either it’s small and red, or it’s heavy’. But it may be indeterminate which line of the truth table obtains, i.e. what the truth value of the statement is. • There is a last noonish second—but not D which it is.

  38. continued • 1 second after noon is noonish. 10,000 seconds after noon is not noonish. • Therefore, it’s not the case that for all n, if n seconds after noon is noonish, n+1 seconds after noon is noonish. • Therefore, for some n, n seconds after noon is noonish and n+1 seconds after noon is not noonish. • Therefore, there is a last noonish second—but it is not determinate which second that is.

  39. More on indeterminacy. There is a last noonish second, a shortest tall man, etc., but it is indeterminate which. Similarly, there is a way the world would have been if A were true, but it is (often) indeterminate which. But it makes perfect sense to have a probability distribution over the candidates.

  40. Advantages of this approach • Smooth theory of negations, conjunctions and disjunctions of conditionals, of conditionals embedded in conditionals, and quantification. • Probability is probability of truth. • Validity is necessary preservation of truth, and thus Adams’s probabilistic criterion of validity is demonstrably satisfied. Thus, the uncertainty of a conjunction can’t exceed the sum of the uncertainties of the conjuncts; a disjunction of conditionals cannot be more probable than the sum of the probabilities of the disjuncts. • If B is true in/false in all A-worlds, the conditional is straightforwardly true/false. Plenty of others may be straightforwardly true/false. • The many uncertain or indeterminate ones come out with the right probability. We needn’t have a bad conscience about saying ‘that’s true/that’s false’ or ‘that’s probably true/probably false’.

  41. Disadvantage • The construction is immensely complicated to work with. • Richard Bradley, ‘Multi-dimensional Possible-World Semantics for Conditionals’. Philosophical Review 2012 • But it is a sort of possibility proof, which shows that the embedding objection can be met.

  42. In practice • Maybe we just rely on the constraints of the probabilistic theory of validity, and other intuitive constraints: the probability of a conjunction can’t exceed that of the conjuncts; • u((A→B)&(C→D)) ≤ u(A→B) + u(C→D) (where u, uncertainty, is 1 − probability); • a few more special cases: if the antecedents are the same, the probability of the conjunction is p(B&C|A); if p(A→B) = 1, the probability of the conjunction is p(D|C), etc.
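A made-up numerical illustration of how the uncertainty constraint gets used: if p(A→B) = 0.9 and p(C→D) = 0.8, then u(A→B) = 0.1 and u(C→D) = 0.2, so u((A→B)&(C→D)) ≤ 0.3; whatever exact value a full semantics assigns, the conjunction must get probability at least 0.7.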

  43. Thank you!
