
Robotics Chapter 6 – Visual Servoing



Presentation Transcript


  1. Robotics Chapter 6 – Visual Servoing. Dr. Amit Goradia

  2. Topics • Introduction – 2 hrs • Coordinate transformations – 6 hrs • Forward Kinematics – 6 hrs • Inverse Kinematics – 6 hrs • Velocity Kinematics – 2 hrs • Trajectory Planning – 6 hrs • Robot Dynamics (Introduction) – 2 hrs • Force Control (Introduction) – 1 hr • Task Planning – 6 hrs • Machine Vision – 6 hrs

  3. Visual Servoing • Controls the movement of a robot using information extracted from camera images. • Operates in closed loop. • Provides better accuracy than look-and-move systems.

  4. Camera Configurations • End-effector mounted (eye-in-hand) • Fixed in the workspace

  5. Servoing Architectures • Position based • Coordinates are extracted from the image • The control law is derived from the extracted coordinates

  6. Servoing Architectures • Image based • Positions are not extracted from the image • Control is applied directly in image space

  7. Position Based • Alignment in the target coordinate system • The 3-D structure of the target is reconstructed • The end-effector is tracked • Sensitive to calibration errors • Sensitive to reconstruction errors

  8. Image Based • Alignment in image coordinates • No explicit 3-D reconstruction necessary • Insensitive to calibration errors • Only certain problems are solvable • Depends on the initial pose • Depends on the selected features

  9. EOL and ECL Configurations • EOL (endpoint open-loop): only the target is observed by the camera • ECL (endpoint closed-loop): both the target and the end-effector are observed by the camera

  10. Position Based Algorithm • Estimation of the relative pose from the image • Computation of the error between the current pose and the target pose • Movement of the robot to reduce the error
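A minimal sketch of this loop, assuming the current and target poses of the end-effector have already been estimated from the image as 4×4 homogeneous transforms; the function name pbvs_twist, the proportional gain k, and the use of SciPy's rotation utilities are illustrative choices, not from the slides.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def pbvs_twist(T_current, T_target, k=1.0):
    """Proportional position-based servoing step.
    T_current, T_target: 4x4 homogeneous poses estimated from the image.
    Returns a 6-vector twist [vx, vy, vz, wx, wy, wz] that drives the
    end-effector toward the target pose."""
    t_err = T_target[:3, 3] - T_current[:3, 3]        # translation error
    R_err = T_target[:3, :3] @ T_current[:3, :3].T    # rotation error matrix
    w_err = R.from_matrix(R_err).as_rotvec()          # axis-angle rotation error
    return k * np.concatenate([t_err, w_err])
```

The resulting twist would be sent to the robot's Cartesian velocity controller and the loop repeated with a fresh pose estimate each frame.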

  11. Position Based Point Alignment • Goal: bring e to 0 by moving p1 • The measured point pxm is subject to the following errors: sensor position, sensor calibration, sensor measurement error • pxm is independent of the following errors: end-effector position, target position • e = |p2m – p1m| • u = k·(p2m – p1m)
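The control law on this slide is only a few lines of code. A minimal sketch, assuming p1m and p2m are measured 3-D points expressed in the same frame and k is an arbitrary gain:

```python
import numpy as np

def point_alignment_step(p1_m, p2_m, k=0.5):
    """Position-based point alignment: drive the measured end-effector
    point p1_m toward the measured target point p2_m."""
    e = np.linalg.norm(p2_m - p1_m)   # scalar alignment error |p2m - p1m|
    u = k * (p2_m - p1_m)             # commanded velocity u = k*(p2m - p1m)
    return e, u
```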

  12. Image Based Point Alignment • Goal: bring e to 0 by moving p1 • The measured image coordinates uxm, vxm are subject only to sensor measurement error • uxm, vxm are independent of the following errors: sensor position, end-effector position, sensor calibration, target position • e = |u1m – v1m| + |u2m – v2m|
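Reading the slide as: u1, u2 are the image coordinates of the end-effector point and v1, v2 those of the target point in cameras c1 and c2 (this reading is an assumption), the summed image-space error can be sketched as:

```python
import numpy as np

def image_alignment_error(u_meas, v_meas):
    """Sum of image-space alignment errors over the cameras.
    u_meas[i], v_meas[i]: 2-D pixel coordinates of the end-effector and
    target points as seen by camera i."""
    return sum(np.linalg.norm(np.asarray(u) - np.asarray(v))
               for u, v in zip(u_meas, v_meas))
```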

  13. Image Jacobian • f is the feature point in image space • r is the corresponding real-world position • The image Jacobian J relates the rates of change: df/dt = J(r) · dr/dt • For the perspective projection model, J has a closed form (see the sketch below)
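A sketch of the closed-form image Jacobian for a single point feature: (u, v) are image-plane coordinates, z is the feature's depth in the camera frame, and lam is the focal length. The signs depend on the camera-frame convention, so this follows one common tutorial formulation and may not match the lecture figures exactly.

```python
import numpy as np

def image_jacobian(u, v, z, lam):
    """Image Jacobian (interaction matrix) of one point feature under the
    perspective projection model.  Maps the camera velocity screw
    [vx, vy, vz, wx, wy, wz] to the feature velocity [du/dt, dv/dt]."""
    return np.array([
        [lam / z, 0.0,     -u / z, -u * v / lam,           (lam**2 + u**2) / lam, -v],
        [0.0,     lam / z, -v / z, -(lam**2 + v**2) / lam,  u * v / lam,            u],
    ])
```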

  14. Examples – Image Space Servoing
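As a concrete counterpart to these examples, here is a minimal image-based servoing step that stacks the per-point Jacobians and applies a proportional law through the pseudo-inverse, v = -gain · J⁺ · e; the function names, the known-depth assumption, and the gain value are illustrative only.

```python
import numpy as np

def point_jacobian(u, v, z, lam):
    # Same single-point interaction matrix as in the sketch for slide 13.
    return np.array([
        [lam / z, 0.0,     -u / z, -u * v / lam,           (lam**2 + u**2) / lam, -v],
        [0.0,     lam / z, -v / z, -(lam**2 + v**2) / lam,  u * v / lam,            u],
    ])

def ibvs_camera_twist(features, desired, depths, lam, gain=0.5):
    """One image-based servoing iteration for N point features.
    features, desired: (N, 2) current and desired image coordinates,
    depths: length-N estimated feature depths, lam: focal length.
    Returns the commanded camera velocity screw [vx, vy, vz, wx, wy, wz]."""
    J = np.vstack([point_jacobian(u, v, z, lam)
                   for (u, v), z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(desired)).ravel()  # stacked image error
    return -gain * np.linalg.pinv(J) @ e                      # v = -gain * J^+ * e
```

Each control cycle the features are re-detected, the twist recomputed, and the loop repeated until the image error is small.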
