
Vision-based HRI (비전기반 HRI)



Presentation Transcript


  1. Vision-based HRI, Hanyang University Graduate School, Department of Intelligent Robotics

  2. Course Introduction • Course overview and objectives • Theoretical understanding of fundamental vision techniques • Implementation of vision techniques through hands-on assignments • Survey of papers on current trends and research themes in vision-based HRI • Development of applied skills through individual projects • Course operation • Target students • Students who understand image-processing theory and have the programming skills to implement it themselves • Textbook and class hours • Supplementary textbook: Multiple View Geometry in Computer Vision • Tuesdays 15:30 ~ 18:30

  3. Evaluation and Other Matters • Written exam • Not given • Lab assignments: 20% • Reports on the assignments given in class • Attendance: 10% • From the second absence on, a per-absence deduction applies • Class participation: 20% • Participation such as presentations • Other: 50% • Evaluation of individual project outcomes • Course homepage • http://kain.hanyang.ac.kr/gs/hri

  4. Instructor • 정승도 (Seung-Do Jeong) • Professor, Department of Information and Communication Engineering, Hanyang Cyber University • OFFICE • Room 501, Cyber Building 2, Hanyang Cyber University • Tel: 2290-0313 • E-mail • sdjeong@hanyang.ac.kr • sdjeong@hycu.ac.kr

  5. Vision-based HRI • Doctoral School – Robotics Program • Autonomous Robots Class • Spring 2008 • Human-Robot Interaction • Aude G. Billard • Learning Algorithms and Systems Laboratory - LASA • EPFL, Swiss Federal Institute of Technology • Lausanne, Switzerland

  6. WHY HRI • (figures: stair-climbing wheelchair "HELIOS-III", Hirose's lab; Ri-Man robot from RIKEN) • Why is HRI beneficial? • Assistance for the elderly and disabled • Assistance with routine tasks that cannot be fully automated • Service robots • Entertainment robots

  7. ISSUES in HRI • (figures: Adaptive Systems Group, UH; Interaction Lab, USC) • Safety • Semi-structured/unstructured environments • Untrained users • Perception • Perception of the environment • Perception of the user • Perception of the user's intent • User-friendly interaction

  8. Interfaces & Interaction modalities • To communicate effectively with humans, robots should be able to perceive and interpret a wide range of communicative modalities and cues. • Types of interactive human-robot interfaces that are most meaningful and expressive for collaborative scenarios: • gesture-based interfaces • non-verbal emotive interfaces • emotion-based interfaces • sound-based interfaces • computer-based interfaces

  9. Multi-modal Means of Interaction Human-robot interaction requires the use of more than one modality at a time. Multiple sources of information provide redundant information, which helps recover from noise in each source. Steil et al., Robotics & Autonomous Systems 47:2-3, 129-141, 2004

  10. Multi-modal Means of Interaction R. Dillmann, Robotics & Autonomous Systems 47:2-3, 109-116, 2004

  11. PDA-Based Interfaces PdaDriver is a Personal Digital Assistant (PDA) interface for remote driving, designed to let any user (novice or expert alike) remotely drive a mobile robot from anywhere and at any time. It runs on a Compaq iPAQ (PocketPC 2000/2002, Windows CE 3.0) with an 802.11b (WiFi) wireless data link. T. Fong, C. Thorpe, and C. Baur, Multi-Robot Remote Driving with Collaborative Control, IEEE Transactions on Industrial Electronics 50(4), August 2003.

  12. PDA-Based Interfaces Displays live images from a camera located on the robot. The user can pan and tilt the camera by clicking in the grey camera control box. Yellow lines shown on the image indicate the projected horizon line and robot width. The user drives the robot by clicking a series of waypoints on the image and then pressing the go button. As the robot moves, the progress bar displays the robot's progress.

  13. PDA-Based Interfaces Terrain visualization system: transforms data collected from robot sensors into a 3D graphical map, which is continuously updated and displayed on the PDA. Converts range data from a SICK LMS-220 laser scanner into VRML using a line stripe classifier and a regular quadrilateral mesh generator. Obstacles and the robot's path (traversed and planned) are color coded.

  14. Tracking of Hand Motion and Facial Expression • Other interaction modes currently in development, which may become common in the future, are: • Tracking of hand/finger motion, either through vision or through data gloves • Tracking and recognition of facial expressions

  15. Haptics Haptics provides information in terms of the forces applied to the device handled by the human demonstrator. The forces are directly transmitted by the robotic device and can thus be reapplied autonomously after the demonstration. (figures: the Pantoscope for laparoscopy surgery simulation, LSRO-EPFL; the Microsurgery Trainer developed at EPFL and the National Univ. of Singapore, NUS)

  16. Brain-machine Interfaces • Brain-computer interfaces recognize the subject's voluntary intent through EEG waves. • These can be interpreted as states and used to convey the user's intention to a robot on the order of milliseconds. • José del R. Millán, Frédéric Renkens, Josep Mouriño and Wulfram Gerstner, Noninvasive Brain-Actuated Control of a Mobile Robot by Human EEG, IEEE Transactions on Biomedical Engineering, Vol. 51, No. 6, June 2004

  17. DESIDERATA FOR THE INTERFACES • The interfaces must be user-friendly and require as little prior knowledge as possible. • They are meant for lay people, children, or disabled people. • They must provide "natural" interactions, i.e. types of interactions similar to those humans have with one another. • There is a tradeoff between high accuracy (wearable instruments) and non-intrusive systems. • The interfaces and interactions must be multimodal. • The interfaces must be safe.

  18. Modes of interaction considered so far are: • vision • proprioception • speech • Types of interaction considered so far are: • recognition and detection of faces • recognition and detection of facial expressions • tracking of body and hand motion • measurement of forces • measurement of brain signals

  19. Image warping

  20. Image warping (1) • Given a coordinate transform (x',y') = h(x,y) and a source image f(x,y), how do we compute a transformed image g(x',y') = f(h(x,y))? • Spatial transformations • Affine, projective transformations • Motion or disparity compensation • (figure: h(x,y) mapping the source image f(x,y) onto the target image g(x',y')) • Image Media Eng. 2004, by S.H. Lee
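For concreteness, the two transform families named on the slide can be written out; these are the standard forms as in the course's supplementary textbook (Multiple View Geometry in Computer Vision), not formulas from the slide itself. An affine warp is

    \begin{pmatrix} x' \\ y' \end{pmatrix} =
    \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
    \begin{pmatrix} x \\ y \end{pmatrix} +
    \begin{pmatrix} t_x \\ t_y \end{pmatrix}

while a projective warp (homography H = [h_{ij}]) divides by a third homogeneous coordinate:

    x' = \frac{h_{11}x + h_{12}y + h_{13}}{h_{31}x + h_{32}y + h_{33}}, \qquad
    y' = \frac{h_{21}x + h_{22}y + h_{23}}{h_{31}x + h_{32}y + h_{33}}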

  21. Image warping (2) • A spatial transformation • Transformation: from coordinate to coordinate • Reference: from value to value (intensity or color) • The mapping is usually defined in matrix form. • (diagram: integer coordinates → real coordinates → integer coordinates)
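A minimal sketch of the matrix form in Python/NumPy (the 30-degree rotation and the (10, 20) translation are arbitrary example values, not from the slides); note how integer pixel coordinates map to real-valued coordinates, as in the diagram above:

    import numpy as np

    # Example affine mapping h as a homogeneous 3x3 matrix:
    # a 30-degree rotation followed by a translation of (10, 20).
    theta = np.deg2rad(30.0)
    H = np.array([[np.cos(theta), -np.sin(theta), 10.0],
                  [np.sin(theta),  np.cos(theta), 20.0],
                  [0.0,            0.0,            1.0]])

    def h(x, y):
        # Integer pixel coordinates in, real-valued coordinates out.
        u, v, w = H @ np.array([x, y, 1.0])
        return u / w, v / w  # dividing by w also covers a projective H

    print(h(5, 7))  # (10.83..., 28.56...): integer input, real-number output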

  22. Forward warping (1) • Send each pixel f(x,y) to its corresponding location (x',y') = h(x,y) in the second image • Known as splatting in computer graphics • (figure: a source pixel of f(x,y) pushed through h(x,y) into g(x',y'))
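As a minimal sketch (assuming a grayscale image as a 2-D NumPy array and a homogeneous 3x3 matrix H as above; not code from the slides), forward warping is a loop over source pixels, and the rounding step is exactly where the hole and overlap problems of the next slide originate:

    import numpy as np

    def forward_warp(f, H):
        # Send every source pixel f[y, x] to its rounded target location.
        g = np.zeros_like(f)
        rows, cols = f.shape
        for y in range(rows):
            for x in range(cols):
                u, v, w = H @ np.array([x, y, 1.0])
                xp, yp = int(round(u / w)), int(round(v / w))
                if 0 <= xp < cols and 0 <= yp < rows:
                    # overlaps simply overwrite; target pixels never hit stay 0 (holes)
                    g[yp, xp] = f[y, x]
        return g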

  23. Forward warping (2) • Forward mapping • Integer input → real-number output • Problems: holes, overlaps • (figure: forward mapping of f(x,y) into g(x',y'), with hole and overlap artifacts marked)

  24. Forward warping (3) • Four-corner mapping • Consider each input pixel as a square • Squares → quadrilaterals • Contiguous pixels → contiguous pixels (removal of holes and overlaps) • Problems: costly intersection tests, magnification • Solution: adaptive sampling of the input image

  25. Forward warping (4) • Recent solutions • Distribution by a continuous radial basis function • Super resolution • Subpixels are generated by interpolation • Random resampling of input pixels • (figures: super resolution, 16 partitions of a pixel; radial basis modeling)
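A rough sketch of the super-resolution idea (hypothetical code, not from the slides): each source pixel is split into n x n subpixels before splatting, which fills many of the holes. For brevity each subpixel reuses its parent pixel's value here, whereas the slide generates subpixel values by interpolation:

    import numpy as np

    def forward_warp_superres(f, H, n=4):
        # n = 4 gives the 16 partitions per pixel mentioned on the slide.
        g = np.zeros_like(f)
        rows, cols = f.shape
        for y in range(rows):
            for x in range(cols):
                for sy in range(n):
                    for sx in range(n):
                        # subpixel centre inside the unit square of pixel (x, y)
                        xs = x + (sx + 0.5) / n - 0.5
                        ys = y + (sy + 0.5) / n - 0.5
                        u, v, w = H @ np.array([xs, ys, 1.0])
                        xp, yp = int(round(u / w)), int(round(v / w))
                        if 0 <= xp < cols and 0 <= yp < rows:
                            g[yp, xp] = f[y, x]
        return g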

  26. Inverse warping (1) • Get each pixel g(x',y') from its corresponding location (x,y) = h^-1(x',y') in the first image • (figure: each target pixel of g(x',y') pulled back through h^-1 into the source image f(x,y))
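A minimal inverse-warping sketch (same assumptions as before, plus an invertible H; nearest-neighbour lookup for now, with interpolation discussed on the next slides):

    import numpy as np

    def inverse_warp_nearest(f, H):
        # For every target pixel, look up (x, y) = h^-1(x', y') in the source.
        Hinv = np.linalg.inv(H)
        g = np.zeros_like(f)
        rows, cols = f.shape
        for yp in range(rows):
            for xp in range(cols):
                u, v, w = Hinv @ np.array([xp, yp, 1.0])
                x, y = int(round(u / w)), int(round(v / w))
                if 0 <= x < cols and 0 <= y < rows:
                    g[yp, xp] = f[y, x]  # every output pixel is computed: no holes
        return g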

  27. Inverse warping (2) • Inverse mapping • Integer input → real-number output • Guarantees that all output pixels are computed • Interpolation using surrounding pixels is needed • Filtering is needed

  28. Inverse warping (3) • Q: What if a pixel comes from "between" two pixels? • A: resample the color value • Resampling techniques: nearest neighbor, bilinear, ... • Usually inverse warping is better • eliminates holes • but requires an invertible warp function • (figure: inverse mapping h^-1(x,y) from g(x',y') back into f(x,y))
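A bilinear resampling sketch (a hypothetical helper, not from the slides): plugging it into the nearest-neighbour loop above, i.e. g[yp, xp] = bilinear(f, u / w, v / w) after a bounds check on the real-valued coordinates, gives a bilinear inverse warp.

    import numpy as np

    def bilinear(f, x, y):
        # Blend the four pixels surrounding the real-valued location (x, y),
        # weighted by proximity; caller must keep (x, y) inside the image.
        x0, y0 = int(np.floor(x)), int(np.floor(y))
        x1, y1 = min(x0 + 1, f.shape[1] - 1), min(y0 + 1, f.shape[0] - 1)
        ax, ay = x - x0, y - y0
        top = (1 - ax) * f[y0, x0] + ax * f[y0, x1]
        bot = (1 - ax) * f[y1, x0] + ax * f[y1, x1]
        return (1 - ay) * top + ay * bot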
