
Safety and Morality REQUIRE the Recognition of Self-Improving Machines as Moral/Justice Patients & Agents


Presentation Transcript


  1. Safety and Morality REQUIRE the Recognition of Self-Improving Machines as Moral/Justice Patients & Agents
  Mark R. Waser

  2. The function/goal of MORALITY IS "to suppress or regulate selfishness and make cooperative social life possible"
  J. Haidt & S. Kesebir, "Morality," Chapter 20 in Handbook of Social Psychology, 5th Edition (Wiley, 2010)

  3. Cooperation Predictably Evolves
  • Evolutionary "ratchets" are local/global optima of biological form and function that emerge, persist, and converge predictably (enjoying sex, fins, etc.).
  • Cooperation exists almost anywhere there is the cognitive machinery and the circumstances to support it.
  • Axelrod's Iterated Prisoner's Dilemma and the evolutionary game theory that followed it provide a rigorous evaluation of the pros and cons of cooperation – including the finding that others *MUST* punish defection and make unethical behavior as expensive as possible (a minimal simulation sketch follows this slide).
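The Axelrod point can be made concrete with a toy simulation. The following is a minimal sketch, not taken from the slides: it assumes the standard Axelrod payoff values (mutual cooperation 3, sucker 0, temptation 5, mutual defection 1) and three hand-written strategies, and shows that a strategy which punishes defection (tit-for-tat) denies a defector almost all of its potential gains, while an unconditional cooperator is fully exploited.

```python
# Minimal Iterated Prisoner's Dilemma sketch (illustrative only).
# Payoffs are the standard Axelrod values, assumed here.

PAYOFF = {
    ('C', 'C'): 3,  # both cooperate
    ('C', 'D'): 0,  # I cooperate, opponent defects (sucker payoff)
    ('D', 'C'): 5,  # I defect against a cooperator (temptation payoff)
    ('D', 'D'): 1,  # mutual defection
}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's last move (punishes defection)."""
    return 'C' if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """Pure selfishness: defect every round."""
    return 'D'

def always_cooperate(opponent_history):
    """Unconditional cooperation: never punishes, so it can be exploited freely."""
    return 'C'

def play(strategy_a, strategy_b, rounds=200):
    """Play an iterated game and return both total scores."""
    seen_by_a, seen_by_b = [], []      # each side's record of the opponent's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(seen_by_a)
        move_b = strategy_b(seen_by_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        seen_by_a.append(move_b)
        seen_by_b.append(move_a)
    return score_a, score_b

if __name__ == "__main__":
    # Against a punisher, defection wins only the first round, then stalls at 1/round.
    print("always_defect vs tit_for_tat:", play(always_defect, tit_for_tat))
    # Mutual cooperation earns a steady 3/round for both sides.
    print("tit_for_tat vs tit_for_tat:", play(tit_for_tat, tit_for_tat))
    # A cooperator that never punishes is exploited for the full 5/round.
    print("always_defect vs always_cooperate:", play(always_defect, always_cooperate))
```

Over 200 rounds the defector scores 204 against tit-for-tat, versus 600 each for two mutual cooperators: this is the sense in which punishing defection makes unethical behavior expensive.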

  4. Selfishness Predictably Evolves
  • There are *very* substantial evolutionary advantages to undetected selfishness and the exploitation of others.
  • Humans have evolved to detect the deceptions used to cloak selfishness and the exploitation of others.
  • In an evolutionary "Red Queen" arms race, humans have also evolved to self-deceive and so exploit the advantages of both selfishness and community.
  • Numerous unconscious reflexes protect our selfishness from discovery without alerting the conscious mind and ruining the self-deception (e.g., images of eyes improve behavior).

  5. MORALITY IS
  • Optimization at/for the community level
  • NOT defecting and harming the community, even when substantial personal gain can be achieved by defection (selfishness)
  • Distinct/different from "doing what is best for the community" (i.e., not self-sacrifice)
  • What is necessary to "make cooperative social life possible"

  6. HUMAN MORALITY IS
  • Implemented primarily as emotions
  • Entirely separate from conscious reasoning (to enable self-deception to hide selfishness)
  • Scientific evidence [Hauser et al., Mind & Language 22:1–21 (2007)] clearly refutes the idea that moral judgments are products of, based upon, or even correctly retrievable by conscious reasoning.
  • Humans are in fact quite likely to consciously discard the very reasons (e.g., the "contact principle") that govern our behavior when it goes unanalyzed.
  • Most human moral "reasoning" is simply post hoc justification of unconscious and inaccessible decisions.

  7. MACHINE MORALITY Could Be
  • Implemented as an integrated system with both "quick and dirty" rules of thumb and a detailed reasoning system that explains why the rules are correct and when they are not (see the sketch after this slide)
  • Entirely transparent in terms of determining (and documenting) true motivation
  • Updated with the newest and best reasoning, and able to serve as a platform for legislation
  • Much "better than human"
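As an illustration of the "integrated system" bullet above, here is a small sketch of one possible shape for such a layer. This is my reading of the slide rather than the author's design; the names (RuleOfThumb, Decision, decide) and the override mechanism are invented for the example. The point is that fast heuristics and a slower reasoner can be combined so that every verdict carries a documented, inspectable motivation.

```python
# Illustrative sketch only: a transparent moral-decision layer pairing fast
# rules of thumb with a slower reasoner that records why each rule applied
# or was overridden.  All names here are invented for the example.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class RuleOfThumb:
    name: str
    applies: Callable[[Dict], bool]   # quick check against a situation
    verdict: str                      # "permit" or "forbid"
    rationale: str                    # why the rule is normally correct

@dataclass
class Decision:
    verdict: str
    audit_log: List[str] = field(default_factory=list)  # documented motivation

def decide(situation: Dict, rules: List[RuleOfThumb],
           reasoner: Callable[[Dict, RuleOfThumb], bool]) -> Decision:
    """Apply the fast rules first; let the detailed reasoner confirm or
    override each match, logging every step so the true motivation stays
    inspectable after the fact."""
    decision = Decision(verdict="permit")
    for rule in rules:
        if not rule.applies(situation):
            continue
        if reasoner(situation, rule):      # the rule holds in this case
            decision.audit_log.append(f"{rule.name}: {rule.rationale}")
            decision.verdict = rule.verdict
        else:                              # a documented exception
            decision.audit_log.append(f"{rule.name}: overridden by reasoner")
    return decision

# Example: a "no deception" rule that the reasoner may override when the
# deception demonstrably prevents a larger harm.
no_deception = RuleOfThumb(
    name="no_deception",
    applies=lambda s: s.get("involves_deception", False),
    verdict="forbid",
    rationale="deception cloaks selfishness and erodes cooperation",
)
print(decide({"involves_deception": True}, [no_deception],
             reasoner=lambda s, r: not s.get("prevents_greater_harm", False)))
```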

  8. The function/goal of JUSTICE IS to suppress or regulate selfishness and make cooperative social life possible.
  Justice is nothing but morality on the scale of groups and communities rather than individuals. It is only because we have not yet lived long enough in large, interconnected communities that we view them as two separate concepts.
  Morality and justice should work together to reduce selfishness at all levels and to maximize consistency and coherency: minimizing interference and conflict while maximizing coordination, cooperation, and economies of scale.

  9. The "Friendly AI" goal to follow humanity's wishes
  • Has a single point of failure!
  • Is NOT self-correcting if corrupted (whether through error or due to "enemy action")
  • Requires determining exactly what "humanity's wishes" are (unless they are simply "to have a cooperative social life . . .")

  10. The "Friendly AI" goal to follow humanity's wishes, viewed impartially . . . MUST be regarded as SELFISH and IMMORAL (and likely to detrimentally affect future relationships)

  11. Steps to Morality/Justice
  • Accept all individual goals/ratchets initially as equal and merely attempt to minimize interference and conflict while maximizing coordination, cooperation, and economies of scale
  • Obvious evils (murder, slavery, etc.) are weeded out by the fact that they suppress goals, create conflict, and waste resources (suppressing even more goals)
  • Non-obvious evils (e.g., one involuntary organ donor used to save five lives) become obvious because of the resources/goals wasted defending against them

  12. Moral/Justice Society Goal/Mission Statement: Maximize the goal fulfillment of all participating entities, as judged/evaluated by the number and diversity of both goals and entities (one hypothetical formalization follows below).
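As a purely hypothetical illustration (my formalization, not the author's), the mission statement could be scored as total goal fulfillment weighted by the diversity of goals and of entity kinds, with Shannon entropy standing in, as an assumption, for "diversity":

```python
# Hypothetical scoring sketch for the mission statement (not from the slides).
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of labels; higher means more diverse."""
    counts = Counter(labels)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def society_score(entities):
    """entities: list of dicts with a 'kind' label and a 'goals' dict
    mapping goal label -> fulfillment in [0, 1]."""
    fulfillment = sum(f for e in entities for f in e["goals"].values())
    goal_diversity = entropy([g for e in entities for g in e["goals"]])
    entity_diversity = entropy([e["kind"] for e in entities])
    # Weight raw fulfillment by both diversity terms, so a monoculture that
    # maximizes a single goal scores lower than a varied, cooperative mix.
    return fulfillment * (1 + goal_diversity) * (1 + entity_diversity)

# Toy comparison: a varied mix outscores a monoculture with the same raw fulfillment.
monoculture = [{"kind": "AI", "goals": {"paperclips": 1.0}} for _ in range(4)]
mixed = [{"kind": "AI", "goals": {"research": 1.0}},
         {"kind": "human", "goals": {"art": 1.0}},
         {"kind": "human", "goals": {"family": 1.0}},
         {"kind": "AI", "goals": {"exploration": 1.0}}]
print(society_score(monoculture), society_score(mixed))
```

Under this toy scoring the varied society scores 24 against the monoculture's 4, which is the direction the mission statement points: reward both the number and the diversity of goals and entities, not just raw fulfillment.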

  13. "Morals"
  • The mission statement should be attractive to all, with entities rapidly joining and reaping the benefits of cooperating rather than fighting.
  • Any entity that places its own selfish goals and values above the benefits of societal-level optimization, and believes that it will profit from doing so (for example, so-called "Friendly AI" advocates), must be regarded as immoral, inimical, dangerous, stupid, and to be avoided.
