
Interactive Systems Technical Design


Presentation Transcript


  1. Interactive Systems Technical Design. Seminar work: GESTURES. Pekka Välitalo, Daniel Pakkala, Teemu Koponen

  2. Introduction/Definitions • The use of motions of the limbs or body as a means of expression • In a computer-controlled environment, the aim is to use the human hand both as a manipulator and as a channel of human-machine communication • Gestures originate as the gesturer’s mental concept, possibly in conjunction with speech • Gestures are expressed through the motion of arms and hands, in the same way that speech is produced by air-stream modulation through the human vocal tract • Observers perceive gestures as streams of visual images, which are interpreted using existing knowledge about those gestures

  3. Motivation • As computing, communication and display technologies progress even further, the existing HCI (Human-Computer Interaction) techniques may become a bottleneck in the effective utilization of the available information flow • The use of hand gestures provides an alternative to cumbersome interface devices for HCI • Visual interpretation of hand gestures can help achieve the ease and naturalness desired for it • More natural interfaces between humans and computing systems

  4. Taxonomy of Hand Gestures for HCI
     Hand/Arm Movements
       - Unintentional Movements
       - Gestures
         - Manipulative
         - Communicative
           - Acts: Mimetic, Deictic
           - Symbols: Referential, Modalizing
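One way to make the taxonomy above concrete is to encode it as a small tree data structure. The Python sketch below is purely illustrative; the nested-dictionary layout and the classify_path helper are assumptions of this write-up, not something defined in the cited taxonomy:

# Minimal sketch: the hand/arm movement taxonomy as a nested dictionary.
# Category names follow the slide; the structure itself is only an illustration.
HAND_ARM_MOVEMENTS = {
    "Unintentional Movements": {},
    "Gestures": {
        "Manipulative": {},
        "Communicative": {
            "Acts": {"Mimetic": {}, "Deictic": {}},
            "Symbols": {"Referential": {}, "Modalizing": {}},
        },
    },
}

def classify_path(leaf, tree=HAND_ARM_MOVEMENTS, path=()):
    """Return the chain of categories leading to a leaf, e.g. 'Deictic'."""
    for name, subtree in tree.items():
        if name == leaf:
            return path + (name,)
        found = classify_path(leaf, subtree, path + (name,))
        if found:
            return found
    return None

# classify_path("Deictic") -> ("Gestures", "Communicative", "Acts", "Deictic")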

  5. Implementation

  6. Implementation • Temporal Modeling: gesture phases (preparation, nucleus, retraction) • Spatial Modeling: 3D hand/arm model or appearance-based model
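The temporal model treats each gesture as a preparation/nucleus/retraction sequence. A minimal Python sketch of a phase tracker is given below; segmenting purely on hand speed and the particular threshold are assumptions made here for illustration, not the method of any cited system:

REST, PREPARATION, NUCLEUS, RETRACTION = "rest", "preparation", "nucleus", "retraction"

class PhaseTracker:
    """Toy state machine that follows the gesture phases of the temporal model."""

    def __init__(self, speed_threshold=0.05):
        self.state = REST
        self.speed_threshold = speed_threshold

    def update(self, hand_speed, near_rest_pose):
        moving = hand_speed > self.speed_threshold
        if self.state == REST and moving:
            self.state = PREPARATION    # hand moves away from the rest position
        elif self.state == PREPARATION and not moving:
            self.state = NUCLEUS        # hand holds/performs the meaningful stroke
        elif self.state == NUCLEUS and moving:
            self.state = RETRACTION     # hand starts returning
        elif self.state == RETRACTION and near_rest_pose and not moving:
            self.state = REST           # gesture complete
        return self.state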

  7. Applications • Control of consumer electronics • Interaction with visualization systems • Control of mechanical systems • Computer games

  8. Hand Gesture Control of a TV Set and a Lamp • Three types of hand gestures: static hand postures, quantitative hand motion and qualitative hand motion (the prototype uses static hand postures) • Lars Bretzner et al., “A Prototype System for Computer Vision Based Human Computer Interaction”, Technical Report, Royal Institute of Technology, Stockholm, Sweden, www.nada.kth.se/cvap/adstracts/cvap251.html

  9. In Detail 1. Images grabbed at frame rate 2. RGB -> new color space 3. Color feature detection -> image features (blobs, ridges) 4. Tracking & pose recognition -> pose, position, scale, orientation 5. Results of steps 3 & 4 matched against a set of object hypotheses using a statistical approach (particle filtering/condensation) 6. Application control (recognized gestures bound to actions)
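Steps 1-3 of this pipeline can be roughly illustrated with OpenCV. The sketch below is only an approximation under assumed HSV skin-color thresholds and assumes OpenCV 4; the cited prototype used its own color model and multi-scale blob/ridge features rather than this code:

import cv2
import numpy as np

# Rough HSV skin-color range; an assumption for illustration only.
SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)
SKIN_HIGH = np.array([25, 180, 255], dtype=np.uint8)

def detect_hand_blob(frame_bgr):
    """Return (cx, cy, area) of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)      # step 2: RGB -> new color space
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)          # step 3: color feature detection
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    m = cv2.moments(blob)
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"], cv2.contourArea(blob)

cap = cv2.VideoCapture(0)                                 # step 1: grab images at frame rate
for _ in range(300):                                      # process a few hundred frames in this sketch
    ok, frame = cap.read()
    if not ok:
        break
    blob = detect_hand_blob(frame)
    # steps 4-6 (tracking, particle filtering, application control) would start here
cap.release()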

  10. Strengths / Advantages • No actual physical remote controller needed • More natural interaction (?) • Relatively simple recognition technique • Nice test case, but...

  11. Limitations / Weaknesses • Usability: slow response time (solution: faster algorithms, more processing power) • Static hand postures & skin colour: what about skin and background colour variations? (solution: recognition based on translations/trajectories) • Two-dimensional gesture recognition: the angle of the hand has to be right in relation to the camera (solution: 3D hand model & recognition, or multiple cameras/multiple models for the same gesture)

  12. Industrial Players • Cybernet Systems Corporation (www.cybernet.com) • A company committed to researching and commercializing technology solutions and advancements in human-machine interaction • UseYourHead: http://www.gesturecentral.com/useyourhead/ • Immersion (www.immersion.com) • SuperGlove: www.immersion.com/products/3d/interaction/cyberglove.shtml • Toshiba (www.toshiba.com) • Motion Processor: www.toshiba.com/news/980715.htm

  13. Siemens (www.siemens.com) • SIVIT (Siemens Virtual Touchscreen) • Gesture Computers for the Disabled • www.siemens.de/ct/en/technologies/ic/sivit.html • www.siemens.com/index.jsp?sdc_p=po263380

  14. International Research Groups and Projects • University of California at Berkeley, GUIR (Group for User Interface Research) • http://guir.berkeley.edu/ • http://guir.berkeley.edu/projects/quill/ • GMD - German National Research Center for Information Technology (Fraunhofer-Gesellschaft) • www.fraunhofer.de/english/ • Interactive Electronic Whiteboard: www.acoustics.hut.fi/icad2001/proceedings/papers/muller-t.pdf • Ruhr-University Bochum, Germany, Institut für Neuroinformatik • http://www.ruhr-uni-bochum.de/index_en.htm • http://www.neuroinformatik.ruhr-uni-bochum.de/top.html

  15. Finnish Research Groups and Projects • Helsinki University of Technology, Intelligent Machines and Special Robotics Institute • WorkPartner, service robot

  16. WorkPartner • Robot control is based on human gestures and speech instead of a keyboard and joystick • All operator hardware is included in the operator’s coat • The coat is a bright-colored textile containing a PC, a speech mike and hand trackers; the bright color of the waistcoat helps the machine recognize and track the operator and his gestures

  17. WorkPartner • Hand trackers are used if direct teleoperation of the manipulators is needed in teaching or in gripping of difficult objects • The ring at the end of the potentiometer wire is simply threaded onto the operator’s thumb • Hand positions are calculated from the length and two direction angles of the wire and transmitted to the robot • More sophisticated (more complex) solutions for the robot control are under research and construction • (Figure: operator controlling the right arm of WorkPartner with a hand tracker)
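Because the hand position comes from one wire length and two direction angles, the computation amounts to a spherical-to-Cartesian conversion. A minimal sketch follows; the axis and angle conventions are assumed here, since the slides do not specify them:

import math

def hand_position(wire_length, azimuth, elevation):
    """Convert the tracker's wire length (metres) and two direction angles
    (radians) into a Cartesian hand position relative to the tracker base.
    The axis/angle conventions are assumptions for illustration."""
    x = wire_length * math.cos(elevation) * math.cos(azimuth)
    y = wire_length * math.cos(elevation) * math.sin(azimuth)
    z = wire_length * math.sin(elevation)
    return x, y, z

# Example: a 0.4 m wire pulled 45 degrees to the side and 30 degrees up
print(hand_position(0.4, math.radians(45), math.radians(30)))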

  18. WorkPartner • Source: http://www.automation.hut.fi/IMSRI/workpartner/

  19. University of Tampere, Department of Computer and Information Sciences, TAUCHI Group (Tampere Unit for Computer-Human Interaction) • Multimodal Interaction Group: multimodal information kiosk interface • The kiosk can be used with hand gestures drawn on the surface of the screen • www.cs.uta.fi/hci/ • www.cs.uta.fi/hci/mmig/kiosk/

  20. Tampere University of Technology, Department of Information Technology • www.cs.tut.fi/ • DMI (Digital Media Institute): www.tut.fi/dmi • Survey: www.cs.tut.fi/kurssit/8107210/ELEET.pdf

  21. VTT Technical Research Centre of Finland, VTT Electronics • www.vtt.fi/ele/research/ais • Using the SoapBox developed by VTT to identify gestures and control the functions of the television • www.vtt.fi/vtt/new/new121.htm • www.vtt.fi/ele/research/els/projects/soapbox.html

  22. Companies and Research Groups in Oulu • University of Oulu, Infotech Oulu • www.infotech.oulu.fi • MediaTeam: www.mediateam.oulu.fi/publications/pdf/106.pdf • VTT Technical Research Centre of Finland, VTT Electronics • www.vtt.fi/ele/research/ais

  23. Future Developments • Real-time interaction based on 3D-model gesture analysis • Human gestures, especially communicative ones, naturally employ actions of both hands -> two-handed gestures • Multi-user interfaces • Simultaneous use of gestures, speech and gaze for multimodal control input. Ideally, any and every gesture performed by the user should be unambiguously interpretable, thus allowing for naturalness of the interface.

  24. Further Reading • Pavlovic, V. I., Sharma, R., Huang, T. S., “Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, July 1997, pp. 677-695. http://www.cs.rutgers.edu/~vladimir/pub/pami97.pdf • Bretzner, L., Laptev, I., Lindeberg, T., Lenman, S., Sundblad, Y., “A Prototype System for Computer Vision Based Human Computer Interaction”, Technical Report, Department of Numerical Analysis and Computing Science, KTH, Stockholm, Sweden. ftp://ftp.nada.kth.se/CVAP/reports/cvap251.pdf • Gesture Recognition Home Page: lots of information about research groups and commercial products. http://www.cybernet.com/~ccohen
