
Rover Navigation and Visual Odometry: A New Framework for Exploration Activities

ICRA Planetary Rover Workshop, Anchorage, Alaska, 3 May 2010. Enrica Zereik, Enrico Simetti, Alessandro Sperindé, Sandro Torelli, Fabio Frassinelli, Davide Ducco and Giuseppe Casalino.


Presentation Transcript


1. Rover Navigation and Visual Odometry: A New Framework for Exploration Activities
Enrica Zereik, Enrico Simetti, Alessandro Sperindé, Sandro Torelli, Fabio Frassinelli, Davide Ducco and Giuseppe Casalino
GRAAL Lab, DIST, University of Genoa
ICRA Planetary Rover Workshop, Anchorage, Alaska, 3 May 2010

2. Why a Framework?
• to deal with the underlying specific hardware platform
• to solve problems related to the real-time constraints of control systems
• to provide data-unaware communication mechanisms
• to be reused for different control systems in several applications
Goal: develop a software architecture that lets researchers focus their attention on the control algorithm only, without caring about the underlying physical system.

3. Main Objectives
• Independence of each control algorithm from the underlying software platform
• Minimization of the number of code lines not strictly related to the control algorithm
• Capability of coordination between remote frameworks
• Standard communication mechanism between control tasks (minimum impact on the algorithm)

4. Abstraction Levels: the objectives above revisited, introducing the KAL (Kernel Abstraction Layer).

5. Abstraction Levels: the objectives above revisited, introducing the WF (WorkFrame).

6. Abstraction Levels: the objectives above revisited, introducing the BBS (BlackBoard System).

7. KAL: Kernel Abstraction Layer
• WorkFrame Name Server: abstraction of the OS resources and services
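The transcript does not detail the KAL programming interface; purely as an illustration (all class and function names below are hypothetical), an abstraction layer of this kind typically hides the host OS behind a small, neutral C++ interface that control tasks are written against:

    // Hypothetical sketch of a kernel abstraction layer (KAL) interface:
    // control code depends only on these classes, never on the host OS API.
    #include <functional>
    #include <string>

    class KalTask {                          // handle to a periodic real-time task
    public:
        virtual ~KalTask() = default;
        virtual void start() = 0;            // begin periodic execution
        virtual void stop() = 0;
    };

    class KernelAbstractionLayer {
    public:
        virtual ~KernelAbstractionLayer() = default;
        // Spawn a periodic task; an OS-specific backend (POSIX, RTAI, ...)
        // implements the actual thread and timer plumbing.
        virtual KalTask* createPeriodicTask(const std::string& name,
                                            std::function<void()> body,
                                            double period_s) = 0;
    };

Written against such an interface, the same control algorithm can move between a desktop simulation and the rover's real-time system without changes, which is the platform-independence objective stated on slide 3.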

8. WF: WorkFrame
• System Manager: resource request handling
• Sched: RelSched can synchronize frameworks
• Logger: communication toward the user

9. BBS: BlackBoard System
• Inter-task communication
• Resource access
• Both local and remote tasks
• Shared BlackBoard publishing data
• Local execution of computation involving BB data
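The BBS API is not shown in the slides; the sketch below is a minimal, single-process illustration of the blackboard idea (hypothetical names, with remote access and resource arbitration omitted): tasks publish named data items and read them back under mutual exclusion.

    // Minimal single-process sketch of a blackboard: tasks publish and read
    // named data items under mutual exclusion (remote access is omitted).
    #include <map>
    #include <mutex>
    #include <string>
    #include <vector>

    class BlackBoard {
    public:
        void publish(const std::string& key, const std::vector<double>& value) {
            std::lock_guard<std::mutex> lock(mtx_);
            data_[key] = value;                       // datum now visible to all tasks
        }
        bool read(const std::string& key, std::vector<double>& out) const {
            std::lock_guard<std::mutex> lock(mtx_);
            auto it = data_.find(key);
            if (it == data_.end()) return false;      // nothing published yet
            out = it->second;
            return true;
        }
    private:
        mutable std::mutex mtx_;
        std::map<std::string, std::vector<double>> data_;
    };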

10. Framework Hierarchy (layered diagram): resources and scheduling; device I/O; mutually exclusive inter-process data sharing (also with remote tasks); network communication; C++ math routines.

11. Visual Odometry Module
• Feature Extraction: in each image of the stereo pair
• Stereo Matching: correspondence search
• Triangulation: computation of the corresponding 3D points
• Tracking in Time: tracking the same features in the following image acquisition
• Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs

12. Visual Odometry Module, Feature Extraction: in each image of the stereo pair; LoG (Laplacian of Gaussian) filtering + SURF (robust descriptors).
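The exact implementation is not given in the transcript; a rough sketch of this step with OpenCV (assuming the non-free xfeatures2d module for SURF, and reading "LOG" as Laplacian-of-Gaussian pre-filtering) might look like:

    // Sketch of per-image feature extraction: LoG-style pre-filtering followed
    // by SURF keypoint detection and descriptor computation (OpenCV).
    #include <opencv2/imgproc.hpp>
    #include <opencv2/xfeatures2d.hpp>
    #include <vector>

    void extractFeatures(const cv::Mat& gray,                    // 8-bit grayscale image
                         std::vector<cv::KeyPoint>& keypoints,
                         cv::Mat& descriptors) {
        // Laplacian of Gaussian: smooth, take the Laplacian, convert back to 8-bit
        cv::Mat smoothed, logResponse, filtered;
        cv::GaussianBlur(gray, smoothed, cv::Size(5, 5), 1.5);
        cv::Laplacian(smoothed, logResponse, CV_16S, 3);
        cv::convertScaleAbs(logResponse, filtered);

        // SURF detection and robust descriptor computation on the filtered image
        cv::Ptr<cv::xfeatures2d::SURF> surf = cv::xfeatures2d::SURF::create(400.0);
        surf->detectAndCompute(filtered, cv::noArray(), keypoints, descriptors);
    }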

13. Visual Odometry Module, Stereo Matching: correspondence search; epipolar constraint, descriptor-based.

14. Visual Odometry Module, Triangulation: computation of the corresponding 3D points; subject to errors, outliers rejected.

15. Visual Odometry Module, Tracking in Time: tracking the same features in the following image acquisition; no external estimation needed, descriptor-based.

16. Visual Odometry Module, Motion Estimation: estimation of the motion that occurred between the two considered stereo image pairs; Least Squares (outlier rejection, initial estimate) + Maximum Likelihood Estimation.
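The transcript gives only the outline of this step. As a hedged illustration, the closed-form least-squares part, a rigid transform between the two sets of triangulated 3D points that can serve as the initial estimate inside an outlier-rejection loop, can be written with Eigen; the maximum-likelihood refinement is left out.

    // Least-squares rigid motion (R, t) between matched 3D point sets via SVD
    // (Arun/Horn closed form); usable as the initial estimate before ML refinement.
    #include <Eigen/Dense>
    #include <vector>

    void estimateMotionLS(const std::vector<Eigen::Vector3d>& prev,  // points at time k
                          const std::vector<Eigen::Vector3d>& curr,  // same points at k+1
                          Eigen::Matrix3d& R, Eigen::Vector3d& t) {
        const double n = static_cast<double>(prev.size());
        Eigen::Vector3d cp = Eigen::Vector3d::Zero(), cq = Eigen::Vector3d::Zero();
        for (size_t i = 0; i < prev.size(); ++i) { cp += prev[i]; cq += curr[i]; }
        cp /= n; cq /= n;                                             // centroids

        Eigen::Matrix3d H = Eigen::Matrix3d::Zero();                  // cross-covariance
        for (size_t i = 0; i < prev.size(); ++i)
            H += (prev[i] - cp) * (curr[i] - cq).transpose();

        Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
        R = svd.matrixV() * svd.matrixU().transpose();
        if (R.determinant() < 0) {                                    // guard against reflections
            Eigen::Matrix3d V = svd.matrixV();
            V.col(2) *= -1.0;
            R = V * svd.matrixU().transpose();
        }
        t = cq - R * cp;                                              // translation
    }

Outliers are typically handled by running this estimate inside a RANSAC-style consensus loop (cf. reference [9] of the visual odometry list on slide 40) and keeping the largest consistent set of matches.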

17. Experimental Setup
• Custom mobile platform @ GRAAL
• Tricycle-like structure
• Bumblebee2 stereo camera system

18. Preliminary Results

19. Robotic Crew Assistant for Exploration Missions: Vision, Force Control and Coordination Strategies
Enrica Zereik, Andrea Sorbara, Andrea Merlo, Frederic Didot and Giuseppe Casalino
GRAAL Lab, DIST, University of Genoa; European Space Agency; Thales Alenia Space, Italy

20. Eurobot Wet Model

21. Eurobot Ground Prototype (annotated photo)
• JR3 force/torque sensor
• pan/tilt stereo cameras for rover navigation
• exchangeable end-effector
• 7 d.o.f. arms, one camera on each arm
• four-wheeled rover for autonomous navigation
• pan/tilt stereoscopic head for manipulation

22. EGP - Control Aspects
• Coordination: coordination of rover and arms; Dynamic Programming-based strategy
• Vision: object recognition and centering; ARToolKitPlus and OpenCV support
• Force: approaching and actual grasping; contact detection

23. General Control Architecture with Priority Tasks
• Dynamic Programming-based
• Coordination of robotic macro-structures
• Independent from the specific system configuration
• Many different control objectives can be required
(Equations on the slide: velocity control task requirement, associated cost-to-go, moving platform velocity.)
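The equations themselves are not reproduced in the transcript. As a generic placeholder only (not the authors' exact formulation, which is developed in the DP-based kinematic inversion papers [10], [11] listed on slide 38), a velocity control task and its quadratic cost-to-go can be written as

    \dot{x}_i = J_i(q)\,\dot{q}, \qquad
    V_i(\dot{q}) = \tfrac{1}{2}\,\big\|\dot{\bar{x}}_i - J_i(q)\,\dot{q}\big\|^2,

where \dot{\bar{x}}_i is the required task velocity (for instance the moving-platform velocity) and the architecture chooses \dot{q} so that lower-priority costs are minimized without degrading higher-priority ones.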

24. General Control Architecture with Priority Tasks (equation slide: definition of the i-th task).

25. Backward Phase (equation slide): use of the above relationships; monitoring of the MM tendency.

26. Forward Phase. Remarks:
• the risk of MM losses still exists (e.g. if the object must be lifted very high)
• if an MM loss is detected, the last resort solution is modulating: Implicit Priority Change

27. Implicit Priority Change: backward phase at platform level.

28. Vision-based Recognition of Objects
• Marker-based object tracking: reliability, robustness
• Occurring problems: lighting conditions, complexity of the captured scene, distance from which the marker is seen
• Preprocessing: preliminary thresholding, image zooming, image cleaning

29. Image Processing Chain (block diagram): image from camera → auto-threshold → image zooming → image cleaning → pose estimation → LPF → to the E-GNC.
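The transcript keeps only the block names. A rough OpenCV sketch of the preprocessing part of such a chain (thresholding, zooming onto a region of interest, cleaning) is given below; the function name, ROI handling and parameters are illustrative, and the marker pose estimator itself is not shown.

    // Illustrative preprocessing chain before marker pose estimation:
    // auto-threshold -> zoom onto the region of interest -> clean-up.
    #include <opencv2/imgproc.hpp>

    cv::Mat preprocessForMarker(const cv::Mat& gray, const cv::Rect& roi) {
        cv::Mat bin, zoomed, cleaned;

        // Automatic thresholding (Otsu) to cope with changing lighting conditions
        cv::threshold(gray, bin, 0, 255, cv::THRESH_BINARY | cv::THRESH_OTSU);

        // "Zoom": crop the expected marker region and upscale it
        cv::resize(bin(roi), zoomed, cv::Size(), 2.0, 2.0, cv::INTER_NEAREST);

        // Cleaning: morphological opening removes small speckles
        cv::Mat kernel = cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3));
        cv::morphologyEx(zoomed, cleaned, cv::MORPH_OPEN, kernel);

        return cleaned;   // fed to the marker tracker / pose estimator
    }

Per the chain above, the cleaned image would then go to the marker tracker (ARToolKitPlus on slide 22) and the resulting pose would be low-pass filtered before reaching the E-GNC.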

30. Implicit Priority Change (plots: angular error, angular error after zooming, angular error after LPF; linear error, linear error after zooming, linear error after LPF).

31. Force-based Approach towards Objects
• Direct Force Control Strategy: detect a contact with the object to be grasped; compensate residual errors
• Pure force only at the palm level: felt by the JR3 sensor; the contact point must belong to the palm; surface known and constant

32. Force-based Approach towards Objects
• Velocity Generation: contact point estimation; velocity assigned to the estimated contact point; computation of the velocity reference with respect to the robot end-effector
• Remarks: noisy sensor and too long a distance from the palm; initial error very small thanks to vision
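The slide only names the steps; a minimal admittance-style illustration of the velocity generation is sketched below. The gain and frames are hypothetical, and the transport of the velocity from the estimated contact point to the end-effector frame is only indicated in a comment.

    // Minimal sketch: turn the force felt at the palm (JR3 sensor) into a
    // velocity reference for the estimated contact point (admittance-like law).
    #include <Eigen/Dense>

    Eigen::Vector3d contactVelocityRef(const Eigen::Vector3d& measuredForce,
                                       const Eigen::Vector3d& desiredForce,
                                       double gain) {                 // hypothetical admittance gain
        // Drive the contact so that the measured force converges to the desired one.
        // The returned velocity, assigned to the estimated contact point, would then
        // be mapped to the end-effector frame through the known rigid palm geometry.
        return gain * (desiredForce - measuredForce);
    }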

33. Simulation Results

34. Experimental Results
• Video: 3. EGP Failed Equipment Replacement.avi

35. Conclusions and Future Work
• EGP: effective and autonomous robotic crew assistant; marker removal; potentially, a flight model
• Planetary Rovers: visual odometry error less than 1%; 3D reconstruction of the environment; DEM construction and autonomous navigation

36. References, I (EGP)
[1] T. Kröger, D. Kubus and F. M. Wahl, "6D Force and Acceleration Sensor Fusion for Compliant Manipulation Control", IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, October 2006.
[2] B. J. Waibel and H. Kazerooni, "Theory and Experiments on the Stability of Robot Compliance Control", IEEE Transactions on Robotics and Automation, February 1991, vol. 7, no. 1, pp. 95-104.
[3] G. Bradski and A. Kaehler, "Learning OpenCV: Computer Vision with the OpenCV Library", O'Reilly.
[4] C. P. Lu, G. D. Hager and E. Mjolsness, "Fast and Globally Convergent Pose Estimation From Video Images", IEEE Transactions on Pattern Analysis and Machine Intelligence, June 2000, vol. 22, no. 6, pp. 610-622.

37. References, II
[5] B. Kainz and M. Streit, "How to Write an Application with Studierstube 4.0", Technical report, Graz University of Technology, 2006.
[6] J. Cai, "Seminar Report: Augmented Reality: the Studierstube Project", seminar report.
[7] E. Zereik, A. Sorbara, G. Casalino and F. Didot, "Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Surface Operations: Force/Vision-Guided Grasping", International Conference on Recent Advances in Space Technologies, Istanbul, Turkey, June 2009.
[8] E. Zereik, A. Sorbara, G. Casalino and F. Didot, "Force/Vision-Guided Grasping for an Autonomous Dual-Arm Mobile Manipulator Crew Assistant for Space Exploration Missions", International Conference on Automation, Robotics and Control Systems, Orlando, USA, July 2009.

38. References, III
[9] G. Casalino and A. Turetta, "Coordination and Control of Multiarm Nonholonomic Mobile Manipulators", in MISTRAL: Methodologies and Integration of Subsystems and Technologies for Robotic Architectures and Locomotion, B. Siciliano, G. Casalino, A. De Luca, C. Melchiorri (Eds.), Springer Tracts in Advanced Robotics, Springer-Verlag, April 2004.
[10] G. Casalino, A. Turetta and A. Sorbara, "Dynamic Programming based Computationally Distributed Kinematic Inversion Technique", Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November 2006.
[11] G. Casalino, A. Turetta and A. Sorbara, "DP-Based Distributed Kinematic Inversion for Complex Robotic Systems", 7th Portuguese Conference on Automatic Control, Lisbon, Portugal, September 2006.
[12] E. Zereik, "Space Robotics Supporting Exploration Missions: Vision, Force Control and Coordination Strategies", Ph.D. Thesis, University of Genova, 2010.

39. References, IV (Visual Odometry)
[1] M. Maurette and E. Baumgartner, "Autonomous Navigation Ability: FIDO Test Results", 6th ESA Workshop on Advanced Space Technologies for Robotics and Automation, Noordwijk, The Netherlands, November 2000.
[2] M. Maimone, Y. Cheng and L. H. Matthies, "Two Years of Visual Odometry on the Mars Exploration Rovers", Journal of Field Robotics, March 2007, vol. 24, no. 3, pp. 169-186.
[3] A. E. Johnson, S. B. Goldberg, Y. Cheng and L. H. Matthies, "Robust and Efficient Stereo Feature Tracking for Visual Odometry", IEEE International Conference on Robotics and Automation, Pasadena, USA, May 2008.
[4] L. Matthies, "Dynamic Stereo Vision", Ph.D. Thesis, Carnegie Mellon University.

40. References, V
[5] D. Nistér, O. Naroditsky and J. Bergen, "Visual Odometry", Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, USA, June 2004.
[6] R. Hartley and A. Zisserman, "Multiple View Geometry in Computer Vision", Cambridge University Press, March 2004.
[7] E. Trucco and A. Verri, "Introductory Techniques for 3-D Computer Vision", Prentice Hall, 1998.
[8] I. J. Cox, S. L. Hingorani, S. B. Rao and B. M. Maggs, "A Maximum Likelihood Stereo Algorithm", Journal of Computer Vision and Image Understanding, 1996, vol. 63, no. 3, pp. 542-567.
[9] M. Fischler and R. Bolles, "Random Sample Consensus: a Paradigm for Model Fitting with Application to Image Analysis and Automated Cartography", Communications of the Association for Computing Machinery, June 1981, vol. 24, pp. 381-395.

41. Thank you for your kind attention!
