ARTIFICIAL INTELLIGENCE: THE MAIN IDEAS. OLLI Course SCI 102, Tuesdays, 11:00 a.m. – 12:30 p.m., Winter Quarter, 2013, Higher Education Center, Medford, Room 226. Nils J. Nilsson. [email protected] http://ai.stanford.edu/~nilsson/. Course Web Page: www.sci102.com/
OLLI’s Community Lecture Series is held on Wednesdays, from 1:00–3:00 p.m., in Room A.
THIS WEEK: OLLI Poets: The Poetry of Aging. A group of OLLI poets known as Poet Scramble will talk about the poetry of aging, and will award prizes to winners of the poetry contest.
This will be the last Community Lecture for the Winter term.
Today is the last class.
No class on March 12!
Deep, Hierarchical Learning
Hidden Markov Models
Heuristic Search: A*
Implicit and Explicit Graphs
Bayes’ Rule, Bayesian Networks
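To make the last topic in the list above concrete, here is a minimal sketch of Bayes’ Rule applied to a hypothetical diagnostic test; the probabilities are invented for illustration and do not come from the lectures.

```python
# Bayes' Rule: P(H | e) = P(e | H) * P(H) / P(e),
# where P(e) = P(e | H) P(H) + P(e | not H) P(not H).
def posterior(prior, likelihood, false_positive_rate):
    """Probability of hypothesis H given positive evidence e."""
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical test: 1% base rate, 99% sensitivity, 5% false-positive rate.
p = posterior(prior=0.01, likelihood=0.99, false_positive_rate=0.05)
print(round(p, 4))  # about 0.1667
```

Even with an accurate test, the low base rate keeps the posterior near 17% — the kind of counter-intuitive result that motivates Bayesian networks.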
Links to lecture slides (will be posted after each lecture)
(Movies and animations do not play in the pdf files, but they should work in the ppsx files.)
Jan 8 PDF Format (30 MB)
Jan 8 PPSX Format (24 MB)
Jan 15 PDF Format (9 MB)
Jan 15 PPSX Format (10 MB)
Jan 22 PDF Format (10.6 MB)
Jan 22 PPSX Format (9.6 MB)
Jan 29 PDF Format (5.3 MB)
Jan 29 PPSX Format (7.3 MB)
Feb 5 PDF Format (3.3 MB)
Feb 5 PPSX Format (35 MB)
Feb 12 PDF Format (4.7 MB)
Feb 12 PPSX Format (3.4 MB)
Feb 19 PDF Format (4.1 MB)
Feb 19 PPSX Format (45 MB) -- This link actually takes you to my Stanford site
Feb 26 PDF Format (24 MB)
Feb 26 PPSX Format (300 MB!)
Some Possibly Worrisome
Robots for Dangerous Jobs
Assisting the Ill or Infirm
More Useful Assistants and
Autonomous Military Robots
Autonomous Surveillance Drones
Automation of Most Jobs
The “Filter Bubble” and Privacy
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
“Difference Engine: Luddite Legacy”
The Economist, Nov 4, 2011
The End of Work: The Decline of the Global Labor Force and the Dawn of the Post-Market Era, Jeremy Rifkin, 1995
“Will a Robot Take Your Job?,” Gary Marcus, The New Yorker, December 29, 2012
“Will Robots Steal Your Job?,” Farhad Manjoo, Slate, Sept. 26, 2011
Robot Ethics: The Ethical and Social Implications of Robotics (Intelligent Robotics and Autonomous Agents Series), Patrick Lin, et al., (Editors), 2012
Governing Lethal Behavior in Autonomous Robots, Ronald Arkin, 2009
“Intelligent technologies are changing every aspect of our lives. If designed with care and forethought, they have the potential to solve many of today’s problems. But if designed sloppily, they could be dangerous and harmful to humanity.”
See: http://selfawaresystems.com/ for more elaboration.
The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think, Eli Pariser, 2011
A Pessimistic Analysis and Call to Action:
Berglas, Anthony (2008), “Artificial Intelligence will Kill our Grandchildren,” http://berglas.org/Articles/AIKillGrandchildren/AIKillGrandchildren.html
There are Some Things Computers Shouldn’t Do:
Joseph Weizenbaum, Computer Power and Human Reason: From Judgment To Calculation, 1976.
Theodore Roszak, The Cult of Information: A Neo-Luddite Treatise on High-Tech, Artificial Intelligence, and the True Art of Thinking, 1994.
“Rise of the Drones,” NOVA, January 23, 2013
“March of the Machines,” 60 Minutes, January 11, 2013