Fish position determination

Fish position determination in 3D space by stereo vision

Martin Kozák

Miroslav Hlaváč

27. 07. 2011


Project goals

  • Design a low-budget system to determine the 3D position of fish in a water environment in real time

  • Explore the capabilities of a two-camera system

  • Explore the capabilities of the Kinect depth sensor

  • Test both systems under different conditions

  • Compare the results from the cameras and the Kinect

  • The designed system will be used to track differences in fish motion


Used equipment and software

  • Aquarium (60 × 30 × 30 cm); a similar one is planned for the real application of this project

  • Two Microsoft LifeCam Studio webcams

  • Calibration object (chessboard)

  • Kinect for Xbox 360

  • Rubber testing object

  • Matlab


Two-camera system

  • The two-camera system emulates human binocular vision

  • The cameras must be calibrated to determine the system parameters

  • These parameters are then used to compute 3D coordinates from two different views of the scene (epipolar geometry)


Epipolar geometry

  • We can determine the position of a point from one image, but to determine its depth we need information from the second camera

  • Select a point in the left image and find the corresponding point on its epipolar line in the right image

  • Compute the 3D coordinates from those two points
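The last step above, recovering 3D coordinates from a pair of corresponding points, can be sketched with standard linear (DLT) triangulation. The project itself was done in Matlab; the sketch below is Python/NumPy, and the camera matrices and point are invented for illustration, not values from the project:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices
    x1, x2 : (u, v) pixel coordinates of the corresponding points
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],   # each pixel coordinate contributes one
        x1[1] * P1[2] - P1[1],   # linear constraint on the homogeneous
        x2[0] * P2[2] - P2[0],   # 3D point X
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)  # least-squares null vector of A
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize

# Toy stereo rig: identical intrinsics, 10 cm baseline along x.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])])

X_true = np.array([0.05, 0.02, 1.0])             # point 1 m in front of the rig
x1 = P1 @ np.append(X_true, 1); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1); x2 = x2[:2] / x2[2]
print(np.round(triangulate(P1, P2, x1, x2), 6))  # recovers X_true ≈ [0.05, 0.02, 1.0]
```

With noise-free projections the DLT recovers the point essentially exactly; with real detections the same least-squares formulation absorbs small pixel errors.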


Camera calibration

  • Two sets of parameters for the cameras

    • Extrinsic (rotation and translation between the cameras)

    • Intrinsic (focal length, skew and pixel distortion for each camera)
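To show how these two parameter sets fit together, the intrinsic matrix and the extrinsic rotation/translation combine into a 3×4 projection matrix. All numbers below are invented for the example (the project used Matlab; this is a Python/NumPy sketch, not the project's calibration results):

```python
import numpy as np

# Hypothetical intrinsics (in pixels): focal lengths fx, fy, principal
# point (cx, cy), skew s. Illustrative values only.
fx, fy, cx, cy, s = 800.0, 800.0, 320.0, 240.0, 0.0
K = np.array([[fx,  s, cx],
              [ 0, fy, cy],
              [ 0,  0,  1]])

# Extrinsics of the second camera relative to the first: no rotation,
# a 10 cm translation along the baseline.
R = np.eye(3)
t = np.array([[-0.1], [0.0], [0.0]])
P = K @ np.hstack([R, t])             # 3x4 projection matrix: x ~ K [R|t] X

# Project a world point 1 m in front of the rig into the second image.
X = np.array([0.0, 0.0, 1.0, 1.0])    # homogeneous world coordinates
u, v, w = P @ X
print(round(u / w, 6), round(v / w, 6))  # → 240.0 240.0
```

Once both cameras' matrices are known from calibration, they are exactly the inputs that triangulation needs.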


Kinect

  • Gaming device for Xbox 360

  • Projects an IR light pattern onto the scene through a special grid

  • Computes depth information from the distortion of the projected pattern


Camera results 1

  • Manual corresponding-point selection

  • Selecting the white point on the rubber testing object manually and computing its 3D trajectory

  • 3D coordinate accuracy is ±0.5 mm


Camera results 2

  • We developed an online tracking system (7 fps)

  • Automatic corresponding-point selection

  • Image thresholding

  • Binary image opening to eliminate small noise

  • Computing the mean position of the white pixels yields the corresponding points in both images
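The threshold–open–centroid pipeline above can be sketched in a few lines. The project used Matlab; this is an illustrative Python/NumPy version with a synthetic frame and a hand-rolled 3×3 opening (a stand-in for a morphology library), run once per camera image:

```python
import numpy as np

def erode3(b):
    """3x3 binary erosion: a pixel survives only if its whole 3x3
    neighborhood is set (pure-NumPy stand-in for a morphology library)."""
    p = np.pad(b, 1)
    h, w = b.shape
    return np.all([p[i:i + h, j:j + w] for i in range(3) for j in range(3)],
                  axis=0)

def dilate3(b):
    """3x3 binary dilation, the dual of erode3."""
    p = np.pad(b, 1)
    h, w = b.shape
    return np.any([p[i:i + h, j:j + w] for i in range(3) for j in range(3)],
                  axis=0)

def track_marker(gray, thresh=200):
    """Threshold, open (erode then dilate) to remove speckle, and return
    the centroid of the remaining white pixels as the corresponding point."""
    binary = gray > thresh               # image thresholding
    binary = dilate3(erode3(binary))     # binary opening
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        return None                      # marker not visible in this frame
    return float(xs.mean()), float(ys.mean())

# Synthetic frame: a bright 5x5 marker centered at (col 30, row 40)
# plus one hot pixel of noise that the opening removes.
frame = np.zeros((100, 100), dtype=np.uint8)
frame[38:43, 28:33] = 255
frame[10, 10] = 255
print(track_marker(frame))  # → (30.0, 40.0)
```

Running this on the left and right frames gives the pair of corresponding points that the triangulation step consumes.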


Kinect accuracy

  • Measured (Kinect) distance vs. real distance as a function of water depth

  • The error is independent of water depth

  • Kinect accuracy along the x-axis in water

  • x-axis accuracy is ±3.5 pixels

[Plots: measured distance [cm] vs. real distance [cm]; object size [pixel] vs. shift from the center of view [cm]]


Kinect results

  • We developed an online tracking system (30 fps)

  • Maximum measurable depth in clear water is 40 cm

  • Maximum measurable depth in dirty water is 20 cm

  • The depth of the fish is obtained by depth thresholding

  • Minimal measurable distance is 80 cm
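The depth-thresholding step can be sketched as follows; the sketch is Python/NumPy rather than the project's Matlab code, and the `near`/`far` limits and depth values are illustrative, not measured (the Kinect itself cannot see closer than roughly 80 cm, per the slide above):

```python
import numpy as np

def segment_fish(depth_mm, near=850, far=1200):
    """Keep pixels whose depth falls inside the aquarium band and return
    the blob centroid (col, row) and its mean depth in millimeters."""
    mask = (depth_mm > near) & (depth_mm < far)   # depth thresholding
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                               # nothing inside the band
    return float(xs.mean()), float(ys.mean()), float(depth_mm[mask].mean())

# Synthetic depth map: back wall at 2 m, a fish-sized blob at 1 m.
depth = np.full((60, 80), 2000, dtype=np.uint16)
depth[20:25, 30:35] = 1000
print(segment_fish(depth))  # → (32.0, 22.0, 1000.0)
```

Because the Kinect's depth map is a direct output, this single thresholding pass replaces the calibration, matching, and triangulation needed by the two-camera system.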


Kinect vs. cameras

Kinect

  • No need for calibration (+)

  • Depth map is a direct output (+)

  • No dependence on color or ambient light (+)

  • Maximal water depth limitation (-)

  • IR-reflecting materials cause errors in the depth map (-)

  • Lower accuracy in water (-)

  • Minimal distance 80 cm (-)

Cameras

  • Precision (+)

  • Environment independence (+)

  • Image segmentation (-)

  • Localization of corresponding points (-)

  • Calibration for each new system position (-)

  • Requires more processing power (-)


Conclusion

  • Both systems are usable for online 3D fish position determination in water

  • We would recommend the Kinect in environments where accuracy is not the main concern, the water is shallow and clean, and more mobility is needed

  • The cameras offer higher accuracy and environment independence, but they require more processing power (corresponding-point detection) and an initial calibration


Acknowledgement

We would like to thank Ing. Petr Císař, Ph.D. for guiding us through this project and for his advice.

