
Thinking Computers

Thinking Computers. Varol Akman, Bilkent University, Computer Engineering and Philosophy. 6 April 2005, Boğaziçi University Cognitive Sciences Graduate Program.



  1. Thinking Computers Varol Akman Bilkent University Computer Engineering and Philosophy 6 April 2005 Boğaziçi University Cognitive Sciences Graduate Program

  2. Donald Davidson (1917-2003) • Carried forward the (late) Wittgensteinian notion that social interaction and exchange are the basis of knowledge • Challenged the (Cartesian) view that an individual mind could know about the world all by itself

  3. Tom Nagel on Davidson • Descartes: • We understand ourselves better than the rest of the world, and we have to construct the objective reality outside of ourselves • Davidson really tried to reverse that: • Understanding ourselves depends on understanding that we are part of a real world, in communication with others

  4. Davidson’s conclusion • [An artifact, e.g., a computer] thinks only if its thinking can be understood by a human interpreter, and this is possible only if the artifact physically resembles a person in important ways, and has an appropriate history. - D. Davidson (1990): Turing’s Test

  5. Turing’s Test (TT) • Due to Alan Turing (1950): Computing Machinery and Intelligence • A computer thinks if it can consistently beat a human opponent in an imitation game

  6. Shifting attitude • An operational test (if it quacks like a duck & walks like a duck, it's a duck!) • Can a computer think? (Turing believed the word ‘think’ cannot be meaningfully applied to machines) • Under what circumstances would a computer be mistaken for a person?

  7. The imitation game • Two contestants (one human, one computer): H and C • One judge (human): J • H & C are hidden from J but can communicate with him by exchanging messages • J types out questions addressed to H & C

  8. The imitation game cont. • J is placed before two terminals • H tries to convince J that he is human, while C does likewise (tries to convince J that it is human) • If the judge cannot regularly identify the computer, the computer is declared a thinker

  9. Recap • If J cannot tell the difference between the conversation with the other person (H) and the conversation with the machine (C), then C is thinking • Note: no restriction on the topic of conversation (all possible areas of human concern)

  10. Recap cont. • The insight underlying TT is the same insight that inspires the new practice among symphony orchestras of conducting auditions with an opaque screen between the jury and the musician. - Dan Dennett (1985)

  11. Game format • Instructions to J: • One of these terminals is connected to a person, the other to a computer • You have t minutes (e.g., 5, according to Turing) to chat with them through these terminals and to determine which is which
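The game format on slides 7, 8, and 11 amounts to a simple protocol, which can be sketched in code. The following is a minimal toy sketch, not anything from the talk: all function names, the two-terminal labels, and the deliberately easy judge and contestants below are my own illustrative assumptions.

```python
import random

def imitation_game(judge_ask, judge_guess, human, computer, rounds=3, seed=None):
    """Toy sketch of one session of the imitation game (names illustrative).

    judge_ask(label, transcript) -> J's next question for that terminal
    judge_guess(transcripts)     -> label J believes hides the computer
    human(q), computer(q)        -> typed replies from H and C
    Returns True if J correctly identifies the computer.
    """
    rng = random.Random(seed)
    labels = ["X", "Y"]
    rng.shuffle(labels)            # J cannot know which terminal is which
    terminals = {labels[0]: ("H", human), labels[1]: ("C", computer)}

    transcripts = {label: [] for label in terminals}
    for _ in range(rounds):        # t minutes of chat, abstracted to N rounds
        for label, (_, contestant) in terminals.items():
            q = judge_ask(label, transcripts[label])
            transcripts[label].append((q, contestant(q)))

    guess = judge_guess(transcripts)
    return terminals[guess][0] == "C"

# A deliberately easy judge/contestant trio, for illustration only:
human = lambda q: "I had coffee and read the paper this morning."
computer = lambda q: "ERROR: UNPARSEABLE INPUT."
judge_ask = lambda label, transcript: "What did you do today?"

def judge_guess(transcripts):
    # Pick the terminal whose replies look machine-like.
    for label, exchanges in transcripts.items():
        if any("ERROR" in answer for _, answer in exchanges):
            return label
    return "X"   # otherwise, guess arbitrarily
```

If the judge cannot do better than chance over many such sessions, the computer passes; here the toy computer gives itself away every time.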

  12. Causal connections • J is ignorant of the physical traits of C • But J must know it is H & C that are physically responsible for the symbol sequences observed in the terminals (that is, they have the causal capacity to produce these texts)

  13. The Yogi Berra method • You can observe a lot just by watching • Is it a man or a woman? • also by being told (if I can’t see it), etc. • I’m determining whether it thinks without inspecting what it thinks

  14. A sufficient condition • In TT, meaningful verbal responses are regarded as the ‘mark’ of thought • Should language be a prerequisite for mentality? • Maybe there are other sufficient criteria for thought (but let’s put that aside in this talk)

  15. The modified TT • Remove H from the experimental set-up (all kinds of hacks can be -- and to some extent, have been -- designed to fool J for 5 minutes) • More importantly, regard C as a thinking thing even if we can distinguish it from H without much effort

  16. Shifting goal • How good is a computer in imitating the verbal behavior of a person? • What are J’s criteria for the presence of thought? In other words, is the object (O) thinking?

  17. ‘Speaking’ English • O produces answers in English in response to J’s questions in English • But how can J make sure that O understands English? • Right syntax • Relevant (à la Sperber & Wilson?) answers • Autonomy

  18. Caveat re syntax • On my naming day when I come 12 I gone front spear and kilt a wyld boar he parbly ben the last wyld pig on the Bunder Downs any how there hadnt ben none for a long time befor him nor I aint looking to see none agen. • Russell Hoban, Riddley Walker

  19. Caveat re syntax cont. • The fall (bababadalgharaghtakamminarronnkonnbronntonnerronntuonnthunntrovarrhounawnskawntoohoohoordenenthurnuk!) of a once wallstrait oldparr is retaled early in bed and later on life down through all christian minstrelsy. The great fall of the offwall entailed at such short notice the pftjschute of Finnegan, erse solid man, that the humptyhillhead of humself prumptly sends an unquiring one well to the west in quest of his tumptytumtoes: and their upturnpikepointandplace is at the knock out in the park where oranges have been laid to rust upon the green since devlinsfirst loved livvy. - James Joyce, Finnegans Wake

  20. Semantics of O • J has no idea re the semantics of O, i.e., the connection between the words that appear on J’s screen and events/things in the world • Maybe the (apparent) semantics was provided by someone who had originally programmed O • In this case, O is not really thinking!

  21. Semantics of O cont. • To see whether O has any semantics, J must study the connection between O’s sentences and the world (W) • J would like to know how O’s responses are in agreement with events and things in the world known to J • In short, we need to locate intelligence, yet the invisible computer poses problems

  22. Naïve physics • Formulate little thought experiments about the physics of daily life (naïve physics) • J says to O: • You are given a shoestring and a children’s truck. Can you pull the truck using the string? Can you push it? Tell me how • This way there is no need to observe O interact with the world; there is no need for a body either

  23. Watching O • Permit J to observe O interact with W (a MAJOR CHANGE IN VIEW!) • 3-way interaction between J, O, and a shared world W (consisting of mutually observed events, things, etc.) • The triangle is formed of the individual, all other people, and the nonhuman universe • Are O’s physical characteristics crucial?

  24. Body matters • O must be able to respond to a large proportion of the world features that can be noted by J • It must be possible for J to notice (e.g., see) that O is sensitive to those features of W and that it is responding appropriately

  25. Weasley’s warning • Never trust anything that can think for itself if you can’t see where it keeps its brain. - J.K. Rowling, Harry Potter and the Chamber of Secrets

  26. Indistinguishability • How much like a person must O be to have thoughts? • This is probably not easily answerable • Maybe too much difference puts limits on the possibility of communication (an elephant vs. an ant?)

  27. Indistinguishability cont. • Mobility, size, sense organs • Displaying emotions • Surely, thoughtfulness is a matter of degree: Newborn Developing child Adult

  28. Augmentors • Any mismatch between J and O re sensitivity to the features of W can be resolved by O, if it is indeed clever • The obvious way for O to accomplish this is to use sensitivity augmentors • A microscope is a sensitivity augmentor • A telescope is also a sensitivity augmentor • You get the idea

  29. Histoire d’O • Beliefs like “… is a cat” & “… is a mat” are usually held as a result of experiences with real cats & mats (how about “… is a unicorn”?) • The cats are on the mat • That’s a cat • That’s a mat

  30. Histoire d’O cont. Q: Where are the cats? A: The cats are on the mat • Reasonable assumption: • In the history of O, knowledge of cats, mats, and the notion of being-on-something played a role • But does O mean anything with A?

  31. Histoire d’O cont. • Just think: • Can you remember the French Revolution? • Do you know Robespierre? (no matter how much you have learned about these in HIST 101) • It is unclear just what is necessary (a history of causal interactions?)

  32. Conclusion (EMBODIMENT) • … the importance to genuine understanding of a rich and intimate perceptual interconnection between an entity and its surrounding world -- the need for something like eyes and ears -- and a similarly complex active engagement with elements in that world -- the need for something like hands with which to do things in that world. - Dan Dennett (1985)

  33. Conclusion (HISTORY) • … only a biography of sorts, a history of actual projects, learning experiences, and other bouts with reality, could produce the sorts of complexities (both external, or behavioral, and internal) that are needed to ground a principled interpretation of an entity as a thinking thing… - Dan Dennett (1985)

  34. Pictures from MIT Humanoid Robotics Group http://www.ai.mit.edu/projects/humanoid-robotics-group/

  35. Coco Rodney Brooks (MIT)

  36. Cog Rodney Brooks (MIT)

  37. Genghis Rodney Brooks (MIT)

  38. Hannibal Rodney Brooks (MIT)

  39. Kismet (displaying disgusted, sad, surprised, interested, and happy expressions) Rodney Brooks (MIT)
