  1. System Architecture for Human-Robot Interaction
  [Title-slide diagram: a Human Agent and a Robot Agent, each composed of IMA primitive agents, connect the human (through the human-interaction layer) to the robot's software system, hardware system, and hardware interface.]
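The title slide shows the system decomposed into a Human Agent and a Robot Agent, each built up from IMA primitive agents that sit between the human and the robot's software and hardware. The slides give no code, so the fragment below is only a minimal Python sketch of that composition idea; PrimitiveAgent, CompoundAgent, step, and the shared-state dictionary are invented names for illustration, not IMA's actual interfaces.

class PrimitiveAgent:
    """One small unit of sensing, computation, or actuation (illustrative only)."""
    def __init__(self, name):
        self.name = name

    def step(self, shared_state):
        # A real primitive agent would read sensors or drive hardware here.
        shared_state[self.name] = "updated"


class CompoundAgent:
    """An agent such as the Human Agent or Robot Agent, built from primitives."""
    def __init__(self, name, primitives):
        self.name = name
        self.primitives = primitives

    def step(self, shared_state):
        for primitive in self.primitives:
            primitive.step(shared_state)


if __name__ == "__main__":
    human_agent = CompoundAgent("HumanAgent",
                                [PrimitiveAgent("FaceDetector"),
                                 PrimitiveAgent("SpeechListener")])
    robot_agent = CompoundAgent("RobotAgent",
                                [PrimitiveAgent("ArmController"),
                                 PrimitiveAgent("HeadController")])
    state = {}
    for agent in (human_agent, robot_agent):
        agent.step(state)
    print(state)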

  2. Welcome to the Center for Intelligent Systems at the Vanderbilt University School of Engineering. CIS conducts research on intelligent robotics and on intelligent manufacturing.
  Research Activities and CIS Links:
  • Intelligent Robotics Lab
  • Intelligent Manufacturing
  • Recent Publications
  • CIS-Affiliated Faculty, Students, and Alumni
  • The CIS Newsletter
  • Employment Opportunities
  Contact Information:
  • Center for Intelligent Systems, Vanderbilt University
  • Dr. Kazuhiko Kawamura, Director
  • Dr. Alan Peters, Assistant Director
  • Dr. Mitch Wilkes, Assistant Director
  • Florence (Flo) Fottrell, Administrator
  • Box 131 Station B, Nashville, TN 37235
  • Phone: (615) 322-7269 (Lab), (615) 343-0697 (Office)
  • Fax: (615) 322-7062
  Other Links:
  • U.S.-Japan Center Home Page
  • Vanderbilt University School of Engineering
  • IEEE RAS Service Robot Technical Committee

  3. Site Contents
  • Research: Projects under IRL
  • Publications: Papers Online
  • People: Faculty, Students, and Alumni
  • Robot Links: Links to Interesting Robot Sites

  4. Welcome
  Welcome to the Intelligent Robotics Laboratory (IRL) at Vanderbilt University's School of Engineering! The IRL is a part of the Center for Intelligent Systems and conducts research on service robots and human/robot symbiosis.
  Site Contents
  • Research: Projects under IRL
  • Publications: Papers Online
  • People: Faculty, Students, and Alumni
  • Robot Links: Links to Interesting Robot Sites
  Contact Information:
  • Director: Dr. Kazuhiko Kawamura
  • Assistant Directors: Dr. M. Wilkes, Dr. R.A. Peters II
  • Research Faculty: Dr. G. Biswas, Dr. D. Gaines, Dr. D. Fisher, Dr. P.K. Basu
  • Administrator: Flo Fottrell
  • Lab Manager: Mark Cambron
  • Web Manager: Tamara Rogers
  Intelligent Robotics Lab, Vanderbilt University, Box 131 Station B, Nashville, TN 37235
  Phone: (615) 322-7269, Fax: (615) 322-7062

  5. Research Projects under IRL
  • Humanoids: ISAC
  • Mobile Robots: HelpMate
  • Bio-Mimetic Control Systems: Associative Memory, Attention System, Sensory Systems, Spreading Activation/Learning, High-Level Agent Structure
  • McKibben Artificial Muscles: Basics, ISAC Arms
  • Rehabilitation Robotics: Rehab Robotics
  • Industrial Automation: Intelligent Planners, Industrial Pick-and-Place Robot, Remote Manufacturing Systems
  • Intelligent Machine Architecture (IMA): IMA, IMA II
  • Anthropomorphic Manipulators: PneuHand, PneuHand II
  • Robots & the Arts: Theremin Playing Robot, Climber Robot (Robin)

  6. ISAC is a dual-arm humanoid robot that was designed and built in the IRL as a research platform for service robotics. The system contains:
  • Two pneumatic 6-DOF SoftArms actuated by McKibben artificial muscles.
  • An air compressor and compressed-air delivery system.
  • A Greifer gripper.
  • The PneuHand, a four-fingered anthropomorphic dexterous manipulator designed and built by the IRL.
  • Two force-torque sensors mounted at the arms' wrist joints.
  • A Directed Perception pan-tilt platform, modified in house for independent verge control of two color cameras.
  • Two 200 MHz dual-processor Pentium Pros: one handles grayscale image processing; the other controls the two SoftArms (through two arm-controller boards built in house) and a multi-channel audio signal processor.
  • One 266 MHz Pentium II with two Imagenation color frame grabbers.
  • One 200 MHz Pentium Pro.
  The dual-arm system provides a test bed for developing new technologies for user-to-robot and robot-to-user communication, including audio, visual, and gestural methods.

  7. The Intelligent Robotics Lab is currently working to incorporate a mobile robot into the ISAC system. The HelpMate mobile robot was donated by Yaskawa Electric of Japan. HelpMate has been upgraded with the following new features:
  • A 400 MHz Pentium II motherboard.
  • A 5-DOF rubbertuator-actuated SoftArm.
  • A lidar sensor for navigation.
  • A vision system, including CATCH and a PCI color frame grabber.
  • New control software, based on IMA.
  • A connection to the Internet via wireless Ethernet.
  HelpMate will soon become an integral part of the ISAC system. A new software architecture (see the related pages for IMA) will allow a combination of local autonomy and user direction, enabling HelpMate to navigate hallways and rooms to accomplish tasks. We are also using HelpMate as a test bed for IMA2, a revised version of IMA.

  8. What Helpmate looked like before we got a hold of it.

  9. A side/front view, showing the sonar arrays, and the arm just hanging there.

  10. This is a rear view, showing the DC-to-AC converter (the black box on the "tailgate"), the air compressor (that red pumpkin-looking thing), the servo valve tree (in the middle), and the manipulator.

  11. Previous SoftArm in a feeding task

  12. ISAC, our dual-arm humanoid, in its original configuration (with the Greifer gripper, the FMA gripper, and the original CATCH pan/tilt/verge head)

  13. A previous version of ISAC, with some of his tools.

  14. Recent Publications 1999
  • D.M. Wilkes, W.A. Alford, R.T. Pack, T.E. Rogers, E.E. Brown, Jr., R.A. Peters II, and K. Kawamura, "Service Robots for Rehabilitation and Assistance", Chapter 2 in Teodorescu and Jain (eds.), Intelligent Systems and Techniques in Rehabilitation, CRC Press, 1999.
  • W.A. Alford, T. Rogers, D.M. Wilkes, and K. Kawamura, "Multi-Agent System for a Human-Friendly Robot", Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC '99), pp. 1064-1069, October 12-15, 1999, Tokyo, Japan.
  • K. Kawamura, "Human-Robot Interaction for a Human-Friendly Robot: A Working Paper", Proceedings of the Second International Symposium on HUmanoid RObotics (HURO '99), pp. 77-85, October 8-9, 1999, Tokyo, Japan.
  • A. Alford, S. Northrup, K. Kawamura, and K-W. Chan, "Music Playing Robot", Proceedings of the International Conference on Field and Service Robotics (FSR '99), pp. 174-178, August 29-31, 1999, Pittsburgh, PA.

  15. Recent Publications 1998
  • S. Charoenseang, A. Srikaew, D.M. Wilkes, and K. Kawamura, "3-D Collision Avoidance for the Dual-Arm Humanoid Robot", IASTED International Conference on Robotics and Manufacturing, Banff, Canada, July 1998.
  • D.M. Wilkes, A. Alford, R.T. Pack, T. Rogers, R.A. Peters II, and K. Kawamura, "Toward Socially Intelligent Service Robots", Applied Artificial Intelligence, An International Journal, vol. 12, pp. 729-766, 1998.
  • A. Srikaew, M.E. Cambron, S. Northrup, R.A. Peters II, D.M. Wilkes, and K. Kawamura, "Humanoid Drawing Robot", IASTED International Conference on Robotics and Manufacturing, Banff, Canada, July 1998.
  • S. Charoenseang, A. Srikaew, D.M. Wilkes, and K. Kawamura, "Integrating Visual Feedback and Force Feedback in 3-D Collision Avoidance for a Dual-Arm Humanoid Robot", Proceedings of the 1998 International Conference on Systems, Man and Cybernetics, California, USA, October 1998.

  16. Recent Publications 1997
  • D.M. Wilkes, R.T. Pack, W.A. Alford, and K. Kawamura, "HuDL, A Design Philosophy for Socially Intelligent Service Robots", Working Notes of the AAAI Symposium on Socially Intelligent Agents, November 1997.
  • R.T. Pack, D.M. Wilkes, and K. Kawamura, "A Software Architecture for Integrated Service Robot Development", 1997 IEEE Conference on Systems, Man, and Cybernetics, Orlando, pp. 3774-3779, September 1997.
  • A. Alford, D.M. Wilkes, K. Kawamura, and R.T. Pack, "Flexible Human Integration for Holonic Manufacturing Systems", Proceedings of the World Manufacturing Congress, New Zealand, pp. 646-651, November 1997.
  • R.T. Pack, D.M. Wilkes, G. Biswas, and K. Kawamura, "Intelligent Machine Architecture for Object-Based System Integration", Proceedings of the 1997 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Waseda University, Japan, June 1997.

  17. Human Agent
  • Motivated by the desire for natural human-robot interaction
  • Encapsulates what the robot knows about the human:
    • Identity
    • Location
    • Intentions

  18. Human Agent Internal Model
  • Model of the current human: a description of the current human
  • Human activity: a description of what the user is doing
  • User's request: the nature of the interaction, i.e., the task the user requests of the robot

  19. Model of the Human
  • Name: Stan
  • Emotion: Happy
  • Command: "Watch me"
  • Face location: (x, y, z) = (122, 34, 205)
  • Hand locations: (x, y, z) = (85, -10, 175) and (x, y, z) = (175, 56, 186)

  20. Model of the Human
  • Name: Stan
  • Emotion: Sad
  • Command: "Watch me"
  • Face location: (x, y, z) = (122, 34, 205)
  • Hand locations: (x, y, z) = (85, -10, 175) and (x, y, z) = (175, 56, 186)
  [Slide figure: the (x, y, z) labels for Stan's face and hands overlaid on the camera view.]
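Slides 18-20 describe the Human Agent's internal model as a small record of the current human: name, emotion, command, face location, and hand locations. As a rough illustration only, here is one way such a record might look in Python, filled in with the example values from slides 19 and 20; the HumanModel class and its field names are assumptions, not the lab's actual data structures.

from dataclasses import dataclass, field
from typing import List, Tuple

Point3D = Tuple[float, float, float]   # (x, y, z) in the robot's coordinate frame

@dataclass
class HumanModel:
    # Field names are illustrative guesses based on slides 19 and 20.
    name: str
    emotion: str
    command: str
    face_location: Point3D
    hand_locations: List[Point3D] = field(default_factory=list)

# Slide 19: Stan is happy and asks the robot to watch him.
stan = HumanModel(name="Stan",
                  emotion="Happy",
                  command="Watch me",
                  face_location=(122, 34, 205),
                  hand_locations=[(85, -10, 175), (175, 56, 186)])

# Slide 20: the same record after the emotion estimate changes.
stan.emotion = "Sad"
print(stan)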

  21. Human Agent Modules
  • Detection module
  • Monitoring module
  • Identification module
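The slide lists the three modules that make up the Human Agent. The fragment below is a hedged sketch of how one update cycle might wire them together (detect a person, then track and identify them); the class and method names and the simple sensor dictionary are assumptions for illustration, not the actual IMA agent interfaces.

class DetectionModule:
    def detect(self, sensors):
        """Return a new human model if a person is found, else None."""
        return {"name": "unknown"} if sensors.get("face_found") else None

class MonitoringModule:
    def track(self, model, sensors):
        """Keep the tracked locations in the model up to date."""
        model["face_location"] = sensors.get("face_location")

class IdentificationModule:
    def identify(self, model):
        """Attach an identity once stored models can be matched."""
        model.setdefault("identity", "unidentified")

class HumanAgent:
    def __init__(self):
        self.detection = DetectionModule()
        self.monitoring = MonitoringModule()
        self.identification = IdentificationModule()
        self.current = None

    def update(self, sensors):
        if self.current is None:
            self.current = self.detection.detect(sensors)
        if self.current is not None:
            self.monitoring.track(self.current, sensors)
            self.identification.identify(self.current)
        return self.current

print(HumanAgent().update({"face_found": True, "face_location": (122, 34, 205)}))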

  22. Detection Module
  • Allows the robot to detect human presence
  • Uses multiple sensor modalities:
    • IR motion sensor array
    • Speech recognition
    • Skin-color segmentation
    • Face detection
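The slide does not say how these modalities are combined, so the fragment below is only an assumed illustration: treat each modality as a vote for human presence and require a minimum number of agreeing cues. The 2-of-4 threshold is invented, not the lab's actual fusion rule.

def detect_human(ir_motion: bool, speech_heard: bool,
                 skin_blob_found: bool, face_found: bool,
                 min_votes: int = 2) -> bool:
    """Declare a person present when enough independent cues agree."""
    votes = sum([ir_motion, speech_heard, skin_blob_found, face_found])
    return votes >= min_votes

if __name__ == "__main__":
    # Motion plus a detected face is enough evidence; motion alone is not.
    print(detect_human(True, False, False, True))   # True
    print(detect_human(True, False, False, False))  # False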

  23. Monitoring Module
  • Keeps track of the detected human
  • Localization and tracking algorithms:
    • Face tracking
    • Finger-pointing gesture
    • Basic speech interface
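The slide names the tracking tasks but not the algorithms. As an assumed illustration, the fragment below keeps the stored face location current by exponentially smoothing each new (x, y, z) measurement into the previous estimate; the filter form and the alpha value are placeholders, not the lab's actual tracking method.

def smooth(previous, measured, alpha=0.5):
    """Blend a new (x, y, z) measurement into the previous estimate."""
    return tuple(alpha * m + (1 - alpha) * p for p, m in zip(previous, measured))

estimate = (122.0, 34.0, 205.0)          # last known face location (slide 19)
for measurement in [(124.0, 35.0, 204.0), (130.0, 38.0, 200.0)]:
    estimate = smooth(estimate, measurement)
    print(estimate)                      # tracked face location drifts toward the measurements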

  24. Identification Module
  • Under development
  • Attempts to identify the detected human by comparing the stored model with the current model:
    • Voice pattern comparison
    • Name
    • Height
    • Clothing color
  • Detects changes in the dynamic model:
    • Clothing color
    • Height
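One simple way to read this slide is as a cue-matching problem: score the current observations against a stored model, trusting stable cues (voice pattern, name) more than dynamic ones (height estimate, clothing color) whose changes the module must also notice. The sketch below is an assumed illustration only; the cue names and weights are not the lab's actual identification algorithm.

def identification_score(stored: dict, observed: dict) -> float:
    """Weighted agreement between a stored human model and current observations."""
    weights = {"voice_id": 0.4, "name": 0.3, "height": 0.15, "clothing_color": 0.15}
    score = 0.0
    for cue, weight in weights.items():
        if cue in stored and cue in observed and stored[cue] == observed[cue]:
            score += weight
    return score

stored_stan = {"voice_id": "stan-01", "name": "Stan",
               "height": 180, "clothing_color": "blue"}
observed = {"voice_id": "stan-01", "name": "Stan",
            "height": 180, "clothing_color": "red"}   # clothing color has changed

print(round(identification_score(stored_stan, observed), 2))  # 0.85: probably still Stan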
