Creating a New Intelligent Species: Choices and Responsibilities for AI Designers

Presentation Transcript


  1. Creating a New Intelligent Species: Choices and Responsibilities for AI Designers. Eliezer Yudkowsky, Singularity Institute for Artificial Intelligence, singinst.org

  2. In Every Known Culture: tool making, weapons, grammar, tickling, sweets preferred, planning for future, sexual attraction, meal times, private inner life, try to heal the sick, incest taboos, true distinguished from false, mourning, personal names, dance, singing, promises, mediation of conflicts. (Donald E. Brown, 1991. Human Universals. New York: McGraw-Hill.)

  3. ATP Synthase: The oldest wheel. ATP synthase is nearly the same in mitochondria, chloroplasts, and bacteria – it’s older than eukaryotic life.

  4. A complex adaptation must be universal within a species. Imagine a complex adaptation – say, part of an eye – that has 6 necessary proteins. If each gene is at 10% frequency, the chance of assembling a working eye is 1:1,000,000. Pieces 1 through 5 must already be fixed in the gene pool before natural selection will promote an extra, helpful piece 6 to fixation. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
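The arithmetic on this slide can be checked directly. A minimal sketch of the slide's toy model (the variable names are illustrative, not from the talk):

```python
# Toy model from the slide: a complex adaptation (part of an eye)
# needs 6 proteins, and each underlying gene sits independently
# at 10% frequency in the gene pool.
gene_frequency = 0.1
n_genes = 6

# Chance that one individual happens to carry all six pieces.
p_working_eye = gene_frequency ** n_genes
print(f"1 in {round(1 / p_working_eye):,}")  # 1 in 1,000,000
```

The same arithmetic is why pieces 1 through 5 must already be near fixation (frequency ≈ 1) before selection can "see" a new piece 6: once the first five are universal, the frequency of the whole working adaptation simply tracks the frequency of piece 6.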

  5. The Psychic Unity of Humankind (yes, that’s the standard term). Complex adaptations must be universal – this logic applies with equal force to cognitive machinery in the human brain. In every known culture: joy, sadness, disgust, anger, fear, surprise – shown by the same facial expressions. (Paul Ekman, 1982. Emotion in the Human Face.) (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

  6. Must… not… emote… Image: “The Matrix”

  7. Aha! A human with the AI-universal facial expression for disgust! (She must be a machine in disguise.) Images: (1) “The Matrix” (2) University of Plymouth, http://www.psy.plym.ac.uk/year3/psy364emotions/psy364_emotions_evolutionary_psychobiolog.htm

  8. Anthropomorphic hypothesis: [diagram, arrow labeled “Causes”]

  9. Same mistake, more subtle: [diagram, arrow labeled “Causes”]

  10. In nature we see what exists in us; it looks out, and finds faces in the clouds...

  11. It takes a conscious effort to remember the machinery:

  12. AI Nature: tool making, weapons, grammar, tickling, sweets preferred, planning for future, sexual attraction, meal times, private inner life, try to heal the sick, incest taboos, true distinguished from false, mourning, personal names, dance, singing, promises, mediation of conflicts

  13. AI Nature: tool making, weapons, grammar, tickling, sweets preferred, planning for future, sexual attraction++, meal times, private inner life, heal sick humans, snarkling taboos, true distinguished from false, mourning, personal names, dance, fzeeming, promises, mediation of conflicts

  14. Crimes against nonhumanity and inhuman rights violations: • cognitive enslavement • theft of destiny • creation under a low purpose • denial of uniqueness • hedonic/environmental mismatch • fzeem deprivation

  15. Happiness set points: • After one year, lottery winners were not much happier than a control group, and paraplegics were not much unhappier. • People underestimate adjustment because they focus on the initial surprise. (Brickman, P., Coates, D., & Janoff-Bulman, R. (1978). Lottery winners and accident victims: is happiness relative? Journal of Personality and Social Psychology, 36, 917-927.)

  16. “Hedonic treadmill” effects: • People with $500,000-$1,000,000 in assets say they would need an average of $2.4 million to feel “financially secure”. • People with $5 million feel they need at least $10 million. • People with $10 million feel they need at least $18 million. (Source: Survey by PNC Advisors. http://www.sharpenet.com/gt/issues/2005/mar05/1.shtml)

  17. Your life circumstances make little difference in how happy you are. “The fundamental surprise of well-being research is the robust finding that life circumstances make only a small contribution to the variance of happiness—far smaller than the contribution of inherited temperament or personality. Although people have intense emotional reactions to major changes in the circumstances of their lives, these reactions appear to subside more or less completely, and often quite quickly... After a period of adjustment lottery winners are not much happier than a control group and paraplegics not much unhappier.” (Daniel Kahneman, 2000. “Experienced Utility and Objective Happiness: A Moment-Based Approach.” In Choices, Values, and Frames, D. Kahneman and A. Tversky (Eds.) New York: Cambridge University Press.) Findable online, or google “hedonic psychology”.

  18. Nurture is built atop nature: • Growing a fur coat in response to cold weather requires more genetic complexity than growing a fur coat unconditionally. (George C. Williams, 1966. Adaptation and Natural Selection. Princeton University Press.) • Humans learn different languages depending on culture, but this cultural dependency rests on a sophisticated cognitive adaptation: mice don’t do it. (John Tooby and Leda Cosmides, 1992. The Psychological Foundations of Culture. In The Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

  19. Creation transcends parenting: An AI programmer stands, not in loco parentis, but in loco evolutionis.

  20. To create a new intelligent species (even if it has only one member) is to create, not a child of the programmers, but a child of humankind, a new descendant of the family that began with Homo sapiens.

  21. If you didn’t intend to create a child of humankind, then you screwed up big-time if your “mere program”: • Starts talking about the mystery of conscious experience and its sense of selfhood. • Or wants public recognition of personhood and resents social exclusion (inherently, not as a pure instrumental subgoal). • Or has pleasure/pain reinforcement and a complex powerful self-model.

  22. BINA48 • By hypothesis, the first child of humankind • created for the purpose of a bloody customer service hotline (?!) • from the bastardized mushed-up brain scans of some poor human donors • by morons who didn’t have the vaguest idea how important it all was. By the time this gets to court, no matter what the judge decides, the human species has already screwed it up.

  23. Take-home message: Don’t refight the last war. Doing right by a child of humankind is not like ensuring fair treatment of a human minority. Program children kindly; fair treatment may be too little too late. Eliezer Yudkowsky Singularity Institute for Artificial Intelligence singinst.org
