
CAUSES AND COUNTERFACTUALS OR THE SUBTLE WISDOM OF BRAINLESS ROBOTS


Presentation Transcript


  1. CAUSES AND COUNTERFACTUALS OR THE SUBTLE WISDOM OF BRAINLESS ROBOTS

  2. ANTIQUITY TO ROBOTICS “I would rather discover one causal relation than be King of Persia” Democritus (430-380 BC). Development of Western science is based on two great achievements: the invention of the formal logical system (in Euclidean geometry) by the Greek philosophers, and the discovery of the possibility to find out causal relationships by systematic experiment (during the Renaissance). A. Einstein, April 23, 1953

  3. David Hume (1711–1776)

  4. HUME’S LEGACY • Analytical vs. empirical claims • Causal claims are empirical • All empirical claims originate from experience.

  5. THE TWO RIDDLES OF CAUSATION • What empirical evidence legitimizes a cause-effect connection? • What inferences can be drawn from causal information? and how?

  6. The Art of Causal Mentoring “Easy, man! that hurts!”

  7. OLD RIDDLES IN NEW DRESS • How should a robot acquire causal information from the environment? • How should a robot process causal information received from its creator-programmer?

  8. CAUSATION AS A PROGRAMMER'S NIGHTMARE • Input: “If the grass is wet, then it rained”; “If we break this bottle, the grass will get wet” • Output: “If we break this bottle, then it rained”

  9. CAUSATION AS A PROGRAMMER'S NIGHTMARE (Cont.) (Lin, 1995) • Input: A suitcase will open iff both locks are open. The right lock is open. • Query: What if we open the left lock? • Output: The right lock might get closed.

  10. BRAINLESS FIRST DISCOVERY: PHYSICS DESERVES A NEW ALGEBRA Scientific equations (e.g., Hooke’s Law) are non-algebraic. e.g., Length (Y) equals a constant (2) times the weight (X): Y = 2X, X = 1 (process information), Y = 2. Had X been 3, Y would be 6. If we raise X to 3, Y would be 6. Must “wipe out” X = 1. The solution, correct notation: Y := 2X, X = 1, Y = 2.
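
A minimal Python sketch (all names and numbers here are illustrative, not from the slides) of why the assignment notation matters: the structural reading Y := 2X lets an intervention wipe out X = 1, keep the mechanism, and re-solve, while leaving no mechanism by which setting Y could change X.

```python
# Illustrative sketch (hypothetical names): algebraic vs. structural reading
# of the spring law "length Y equals 2 times weight X".

mechanisms = {"Y": lambda v: 2 * v["X"]}   # structural reading: Y := 2X

# Observation: process information X = 1, so Y = 2.
v = {"X": 1}
v["Y"] = mechanisms["Y"](v)
print(v)                                   # {'X': 1, 'Y': 2}

# Intervention "raise X to 3": wipe out X = 1, keep the mechanism, re-solve.
v_do = {"X": 3}
v_do["Y"] = mechanisms["Y"](v_do)
print(v_do)                                # {'X': 3, 'Y': 6}

# Setting the length (do(Y = 6)) does NOT change the weight: there is no
# mechanism X := Y / 2 in the model, which the symmetric equation Y = 2X hides.
v_doY = dict(v, Y=6)
print(v_doY)                               # {'X': 1, 'Y': 6}
```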

  11. MATHEMATICAL EXTRAPOLATION: THE WORLD AS A COLLECTION OF SPRINGS • Definition: A structural causal model is a 4-tuple ⟨V, U, F, P(u)⟩, where • V = {V1,...,Vn} are endogenous variables • U = {U1,...,Um} are background variables • F = {f1,...,fn} are functions determining V: vi = fi(v, u) • P(u) is a distribution over U • P(u) and F induce a distribution P(v) over observable variables
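
A hypothetical rendering of the 4-tuple above as a small Python class (the name SCM and the toy spring instance are assumptions for illustration): it draws u from P(u) and evaluates each fi in causal order, inducing the observable distribution P(v).

```python
import random
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SCM:
    """Hypothetical rendering of the 4-tuple <V, U, F, P(u)>."""
    exogenous: List[str]                    # U = {U1, ..., Um}
    endogenous: List[str]                   # V = {V1, ..., Vn}, in causal order
    f: Dict[str, Callable[[dict], object]]  # F: one function fi per Vi, vi = fi(v, u)
    p_u: Callable[[], dict]                 # sampler for the distribution P(u)

    def sample(self) -> dict:
        """Draw u ~ P(u), then evaluate each fi in causal order,
        yielding one draw from the observable distribution P(v)."""
        values = self.p_u()
        for name in self.endogenous:
            values[name] = self.f[name](values)
        return values

# Toy "spring" instance: U -> X -> Y with Y := 2X.
model = SCM(
    exogenous=["U"],
    endogenous=["X", "Y"],
    f={"X": lambda w: w["U"], "Y": lambda w: 2 * w["X"]},
    p_u=lambda: {"U": random.choice([1, 2, 3])},
)
print(model.sample())   # e.g. {'U': 2, 'X': 2, 'Y': 4}
```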

  12. FAMILIAR CAUSAL MODEL: ORACLE FOR COUNTERFACTUALS [Diagram: causal graph over variables X, Y, Z, labeled INPUT and OUTPUT]

  13. BRAINLESS SECOND DISCOVERY: COUNTERFACTUALS ARE EMBARRASSINGLY SIMPLE Definition: The sentence “Y would be y (in situation u), had X been x,” denoted Yx(u) = y, means: the solution for Y in a mutilated model Mx (i.e., the equations for X replaced by X = x), with input U = u, is equal to y. The Fundamental Equation of Counterfactuals: Yx(u) = YMx(u).
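
Continuing the hypothetical SCM sketch above, the definition can be computed literally: build the mutilated model Mx by replacing the equation for X with the constant x, then solve with the same u and read off Y.

```python
# Continuation of the hypothetical SCM sketch: Yx(u) by mutilation.
def counterfactual(model, u, x_var, x_val, y_var):
    """Solve the mutilated model Mx (the equation for x_var replaced by
    the constant x_val) with input U = u; return the value of y_var."""
    mutilated_f = dict(model.f)
    mutilated_f[x_var] = lambda w: x_val      # replace fX by X := x
    values = dict(u)
    for name in model.endogenous:
        values[name] = mutilated_f[name](values)
    return values[y_var]

# "Had X been 3 (in situation u where U = 1), Y would have been 6."
print(counterfactual(model, {"U": 1}, "X", 3, "Y"))   # -> 6
```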

  14. BRAINLESS SECOND DISCOVERY: COUNTERFACTUALS ARE EMBARRASSINGLY SIMPLE Definition (repeated): “Y would be y (in situation u), had X been x,” denoted Yx(u) = y, means the solution for Y in the mutilated model Mx, with input U = u, is equal to y. Joint probabilities of counterfactuals: P(Yx = y, Zw = z) = Σ{u: Yx(u) = y, Zw(u) = z} P(u). In particular: P(Yx = y) = Σ{u: Yx(u) = y} P(u).
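
A hedged continuation of the same sketch: P(Yx = y) obtained by summing P(u) over those u for which the mutilated model yields Yx(u) = y. The table of P(u) values is invented for illustration.

```python
# Continuation: P(Yx = y) = sum of P(u) over {u : Yx(u) = y}.
def prob_counterfactual(model, p_u_table, x_var, x_val, y_var, y_val):
    """p_u_table maps each u (a tuple of (name, value) pairs) to P(u)."""
    return sum(p for u_items, p in p_u_table.items()
               if counterfactual(model, dict(u_items), x_var, x_val, y_var) == y_val)

# Illustrative P(u): uniform over U in {1, 2, 3}.
p_u_table = {(("U", u),): 1 / 3 for u in (1, 2, 3)}

# P(Y_{X=3} = 6) = 1 here, since Y := 2X ignores U once X is set.
print(prob_counterfactual(model, p_u_table, "X", 3, "Y", 6))   # -> 1.0
```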

  15. THE STRUCTURAL MODEL PARADIGM [Diagram: Data Generating Model M induces a Joint Distribution, which produces Data; inference targets Q(M), aspects of M] • M – Invariant strategy (mechanism, recipe, law, protocol) by which Nature assigns values to variables in the analysis. “Think Nature, not experiment!”

  16. THE PUZZLE OF COUNTERFACTUAL CONSENSUS • Indicative: “If Oswald didn’t kill Kennedy, someone else did,” • Subjunctive: “If Oswald hadn’t killed Kennedy, someone else would have.” • (Adams 1975)

  17. THE PUZZLING UBIQUITY OF COUNTERFACTUALS • Hume’s Definition of “cause”: “We may define a cause to be an object followed by another, and where all the objects, similar to the first, are followed by objects similar to the second. Or, in other words, where, if the first object had not been, the second never had existed.” (Hume 1748/1958, sec. VII) • Lewis’s Definition of “cause”: “A has caused B” if “B would not have occurred if it were not for A” (Lewis 1986). • Why not define counterfactuals in terms of causes? (Pearl 2000)

  18. STRUCTURAL AND SIMILARITY-BASED COUNTERFACTUALS Lewis’s account (1973): The counterfactual “B if it were A” is true in a world w just in case B is true in all the closest A-worlds to w. Structural account (1995): “Y would be y if X were x” is true in situation u just in case Yx(u) = y, i.e., the mutilated model Mx, with input u, yields Y = y. [Diagram: world w and its closest A-worlds in which B holds]

  19. P(SE) = P (SE) P(SE) = 1 M M SE OS = true true OS OS OS OS SE SE SE true true false false K K K true true true true M M M M M OS SE OS OS SE Realizing Oswald did not kill Kennedy Oswald killed Kennedy Prior knowledge S1: “IF OSWALD DIDN’T KILL KENNEDY, SOMEONE ELSE DID” P (SE)

  20. P (SE) = P(SE) true = true M SE OS OS SE SE true K K true M M M M M M SE SE OS OS OS OS After learning Oswald killed Kennedy S2: “IF OSWALD HADN’T KILLED KENNEDY, SOMEONE ELSE WOULD HAVE?” P (SE) SE OS K Oswald refraining from killing Prior knowledge

  21. P (SE) = P(SE) = true OS SE true K true M M M M M false SE SE OS OS OS S2: “IF OSWALD HADN’T KILLED KENNEDY, SOMEONE ELSE WOULD HAVE?” P (SE) P (SE) = P(SE) true M SE OS SE OS SE K K Oswald refraining from killing After learning Oswald killed Kennedy Prior knowledge

  22. BRAINLESS THIRD DISCOVERY: HIGH SCHOOL COUNTERFACTUALS CAN BE USEFUL • Solidify and unify (all?) other approaches to causation (e.g., PO, SEM, DA, Prob., SC) • Demystify enigmatic notions and weed out myths and misconceptions (e.g., ignorability, exogeneity, exchangeability, confounding, mediation, attribution, regret) • Algorithmize causal inference tasks (e.g., covariate selection, identification, c-equivalence, effect restoration, experimental integration, sufficiency) • Resolve lingering puzzles

  23. CONCLUSIONS • If Oswald had not used counterfactuals, brainless would have. • Much of modern thinking is owed to brainless robots. • I compute, therefore I think.
