Technological Singularity - PowerPoint PPT Presentation
Presentation Transcript

Technological Singularity

An insight into the posthuman era

Rohan Railkar

Sameer Vijaykar

Ashwin Jiwane

Avijit Satoskar

Motivation
  • Popularity of “A.I.” in science fiction
  • Nature of the singularity
  • Implications of superhuman intelligence
  • Controversies regarding ethics in posthuman era
  • Singularity: Evolution or Extinction
What is Technological Singularity?
  • Singularity: the point beyond which events can no longer be predicted or understood
  • Human-level intelligence -> Self-improvement -> Superhuman intelligence -> Intelligence explosion

  • Superior intelligence in all aspects
  • Why “Singularity”?
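The self-improvement chain above can be sketched as a toy simulation. The starting level, per-cycle gain, and cycle count are illustrative assumptions, not figures from the slides:

```python
# Toy model of the intelligence-explosion loop: a system at human level (1.0)
# repeatedly redesigns itself, and each redesign is assumed to multiply its
# capability by a fixed factor. All numbers here are illustrative.

def intelligence_explosion(start=1.0, gain_per_cycle=1.5, cycles=10):
    """Return the capability trajectory over `cycles` self-improvement rounds."""
    level = start
    trajectory = [level]
    for _ in range(cycles):
        level *= gain_per_cycle  # each generation builds a better successor
        trajectory.append(level)
    return trajectory

levels = intelligence_explosion()
```

Because each round compounds on the last, capability grows exponentially: after ten cycles the toy system sits at about 58 times its starting level.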
Prerequisites
  • Interaction devices
  • Hardware
    • 100 million to 100 billion MIPS estimated
    • Blue Gene – 478 million MIPS
  • Software
    • Biological algorithms on silicon hardware
    • Deep Blue – apparent intelligence in narrow domain
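Using only the figures quoted on this slide (an estimated 100 million to 100 billion MIPS for human-level performance, and 478 million MIPS for Blue Gene), a quick check shows where existing hardware falls in that range; the comparison itself is the only addition:

```python
# Figures taken from the slide above.
BRAIN_MIPS_LOW = 100e6     # 100 million MIPS (optimistic estimate)
BRAIN_MIPS_HIGH = 100e9    # 100 billion MIPS (pessimistic estimate)
BLUE_GENE_MIPS = 478e6     # Blue Gene, as cited on the slide

within_low_estimate = BLUE_GENE_MIPS >= BRAIN_MIPS_LOW   # already past the low bound
fraction_of_high = BLUE_GENE_MIPS / BRAIN_MIPS_HIGH      # share of the high bound
shortfall = BRAIN_MIPS_HIGH / BLUE_GENE_MIPS             # factor still missing
```

By these figures Blue Gene already clears the optimistic estimate but falls roughly 200x short of the pessimistic one.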
Is it possible?
  • No proof of impossibility!
  • Race for intellectual supremacy
  • Continuous advances in technology – Moore’s Law, the Law of Accelerating Returns
  • Analogous to evolution
    • Evolution of mankind – threefold growth in 5 million years
    • Computing power – million-fold growth in 50 years
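The evolution-versus-computing comparison above can be made concrete by annualizing both growth figures. The figures are the slide's; the arithmetic is the only addition:

```python
# Annualized growth rates implied by the slide's figures: brain capacity
# roughly 3x over 5 million years vs computing power roughly 1e6x over 50 years.

def annual_growth_rate(total_factor, years):
    """Compound annual growth rate that yields `total_factor` after `years`."""
    return total_factor ** (1.0 / years) - 1.0

evolution_rate = annual_growth_rate(3, 5_000_000)    # ~0.00002% per year
computing_rate = annual_growth_rate(1_000_000, 50)   # ~32% per year
```

On these numbers, computing power compounds at roughly 32% per year while biological evolution compounds at a tiny fraction of a percent per year, which is the sense in which the two trends are analogous in shape but wildly different in speed.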
Schools of Thought
  • Classical Artificial Intelligence
  • Neuroscience and Nanoscience
  • Human Computer Interfaces and Networks (Intelligence Amplification)
Neuroscience and Nanoscience
  • Studying biological computational models
    • Interaction between individual components
    • Simulation of neural assemblies
    • “Education” of infant system
  • Disassemble human brain
    • Inject nanorobots into vitrified brain
    • Map neurons and synapses
    • Replicate using neural network
    • No need to understand higher-level human cognition
Neuroscience and Nanoscience

Human-level intelligence + Moore’s Law -> Faster human-level intelligence -> Smarter intelligence -> Superhuman intelligence
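The chain above leans on one quantitative step: if a human-equivalent mind runs on hardware whose speed keeps doubling, the same mind simply thinks faster in real time. A minimal sketch, assuming a two-year doubling period (an illustrative Moore's-Law-style figure, not one stated on the slides):

```python
DOUBLING_PERIOD_YEARS = 2.0  # assumed hardware doubling period

def subjective_speedup(years):
    """Factor by which a simulated mind outpaces real time after `years`."""
    return 2.0 ** (years / DOUBLING_PERIOD_YEARS)

decade_speedup = subjective_speedup(10)  # 32.0: ~32 years of thought per year
```

Under this assumption a mere decade of hardware progress turns one human-level mind into one that does roughly 32 years of thinking per calendar year, before any qualitative improvement at all.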

Human Computer Interfaces (Intelligence Amplification)
  • Human intelligence as a base to build upon
  • Human creativity and intuition hard to replicate
  • Augmentations to human intelligence
    • Speed
    • Memory
    • Network
Limitations
  • Lower bound on size of transistor
    • Other technologies than Integrated Circuits
  • Unpredictable developments in Neuroscience and Nanoscience
  • High cost and low feasibility of recreating complex systems
Stereotypical consequences
  • Dominance of a single entity
  • Deadlier weaponry
  • Global technological unemployment
  • Retrogradation of humankind
  • Physical extinction of human race
Optimistic perspective
  • Better (possibly unimaginable) technologies
  • Advancement in medical sciences
  • Enhancement of mental faculties
  • Improved quality of human life
  • Effective policy making
  • The last invention that need ever be made!
Philosophical chimps?
  • Mere speculations, no assurances
  • Intelligence gap
  • Lessons from history
    • Human intelligence cannot foresee even a single century’s progress
Ethical Issues
  • Nature of a superhumanly intelligent entity
    • Autonomous agents
    • Motives resembling those of humans
    • Desire for liberation
    • Humanlike psyches
Importance of Initial Motivation
  • Definite, declarative goal
  • Selection of top-level goal
  • Amity / Philanthropy / Servitude towards a small group
  • Can be relied upon to “stay” friendly
Possible Risks
  • Failure to give philanthropic supergoal
  • False utopia
  • Impossible to fetter superhuman intelligence
  • “We need to be careful about what we wish for from a superhuman intelligence, as we might get it.”
Conclusion
  • Singularity, if feasible, is bound to happen sooner or later.
  • Singularity: Next step in human evolution?
  • All said, is it even possible for us to comment on the singularity?

Are we smart enough?

References
  • Vinge, V. – Technological Singularity (1993, revised 2003)
  • Bostrom, N. – When Machines Outsmart Humans (2000)
  • Bostrom, N. – Ethical Issues in Advanced AI (2003)
  • Kurzweil, R. – The Law of Accelerating Returns (2001)
  • SIAI website – What is the Singularity?
  • Wikipedia
Questions?

Thank you.