
Gesture Based Interface to Robots using Kinect and Matlab



  1. Gesture Based Interface to Robots using Kinect and Matlab Team: Katherine Coley: krcoley@buffalo.edu Smita Chutke: smitachutke@gmail.com Johanna Ingrid Carolina Hellstrom: johannai@buffalo.edu *PowerPoint compiled by Katherine Coley

  2. Website: https://sites.google.com/site/robonowtokinectinmatlab/

  3. Table of Contents:
     1. Introduction
     2. Background
        2.1. Background of Lego Mindstorms NXT
        2.2. Background of Kinect Sensor
        2.3. Background of MatLab
        2.4. Background of Simulink
     3. Materials
     4. Project System Integration
        4.1. Robot Design: Model 1
        4.2. Robot Design: Model 2
        4.3. Human Gesture Analysis

  4. Table of Contents (cont.):
        4.4. Human-Robot Interface: Model 1
        4.5. Human-Robot Interface: Model 2
     5. Kinect Accuracy
     6. Conclusion
     7. References

  5. 1.0. Introduction • The use of robotic systems has emerged in various fields over the years and has led many researchers to explore the possibilities that robots have to offer. With the availability of inexpensive tools, such as the Kinect from Microsoft, individuals and corporations alike have the ability to explore new avenues for robotic technology. Many potential applications of human gesture control of robots have been recognized in various fields including, but not limited to, business, education, and medicine. This project uses these inexpensive tools to analyze the possibility of controlling the movements of a Lego Mindstorms NXT robot using human gestures and to understand possible uses for similar technology in the future.

  6. 2.1. Background of Lego Mindstorms • User-friendly kit containing software and hardware to build programmable robots [10] • Available in various kit models [8] • Kits include an intelligent brick, sensors, motors, and various mechanical pieces [10][8] • A global community shares designs and effective techniques and holds various contests [8]

  7. 2.2. Background of Kinect Sensor • Code-named Project Natal and released in 2010 [1] • Positioned as a more advanced take on motion control than the Wii • Comprises a color VGA camera, a depth sensor, and a multi-array microphone [1] • Analyzes a player's body movement, recognizes a player's face and voice, and can extrapolate blocked portions of a player's skeleton [1] • Used, or has the potential to be used, in the: • Medical field [4] • Business field [2] • Education field [1]

  8. 2.3. Background of MatLab • Developed in the late 1970s [15] • Comprised of 5 main parts: the MatLab language, the working environment, handle graphics, the MatLab mathematical function library, and the MatLab application program interface [15] • Dimension-free data elements (matrices that never need explicit dimensioning) enable quick computation of problems [16] • Has a library of toolboxes [16] • Currently the standard educational programming software used in colleges [16]
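As a brief illustration of the dimension-free, matrix-based style mentioned above, a minimal MatLab sketch (the variable names and values are purely illustrative):

    % Matrices are created and combined without declaring their dimensions in advance.
    A = [1 2 3; 4 5 6];   % a 2x3 matrix
    B = A * A';           % matrix multiplication gives a 2x2 result
    v = A(:, 2);          % second column extracted as a vector
    s = mean(A(:));       % mean over all elements, whatever the shape
    disp(B); disp(v); disp(s);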

  9. 2.4. Background of Simulink • Graphical programming environment developed by MathWorks [6] • Provides an interactive, graphical environment for multi-domain simulation, modeling, and analysis [5] • Predefined blocks as well as customizable blocks make it user friendly [7] • Two main categories: Blocks and Lines [7] • Blocks: sources, sinks, discrete, continuous, signal routing, and math operations [7] • Lines: transmit signals to other blocks [7]

  10. 3.0. Materials: • A camera that can be connected to the Lego Mindstorms NXT robot, plus an interface able to process the images in real time and supply motor inputs. • A webcam or a Kinect could be used as the camera • Possible interfaces include: 1. C++ with OpenCV, ROS 2. Player/Stage (Gazebo) 3. Microsoft RDS 4. Webots 5. V-REP 6. USARSim 7. MRSim (MatLab toolbox) • This project uses a Microsoft Kinect sensor to track skeletal joint coordinates in real time and then uses these coordinates in the MatLab software as an input to control the Lego Mindstorms NXT robot. The basic MatLab software is unequipped to perform this function; therefore, the Image Acquisition Toolbox had to be downloaded.
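A minimal sketch of how the skeletal joint coordinates can be read in MatLab through the Image Acquisition Toolbox. This assumes the toolbox's Kinect for Windows support is installed; the device index, format name, and metadata fields follow that Kinect adaptor:

    % Open the Kinect depth stream (device 2 of the 'kinect' adaptor) and enable skeletal tracking.
    depthVid = videoinput('kinect', 2, 'Depth_640x480');
    depthSrc = getselectedsource(depthVid);
    depthSrc.TrackingMode = 'Skeleton';

    depthVid.FramesPerTrigger = 1;
    start(depthVid);
    [~, ~, metaData] = getdata(depthVid);   % the metadata carries the skeleton information

    % JointWorldCoordinates is a 20x3xN array: 20 joints, (x, y, z) in meters, N tracked skeletons.
    if any(metaData.IsSkeletonTracked)
        idx = find(metaData.IsSkeletonTracked, 1);
        joints = metaData.JointWorldCoordinates(:, :, idx);
        disp(joints);
    end
    stop(depthVid);

Repeating the acquisition yields a fresh 20x3 set of joint coordinates for each frame, which the later slides use to compute motor commands.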

  11. 4.0. Project System Integration Two different models of the Lego NXT robot were built to analyze and compare the different build styles and how well each one works.

  12. 4.1. Robot Design: Model 1 • The first model was designed with two treads and a motor on each side of the body that controls the corresponding tread.

  13. 4.2. Robot Design: Model 2 • The second model was designed with four wheels in place of the treads and a steering mechanism similar to that of a car. The first motor is placed at the front of the Lego NXT robot and is rotated in order to turn the robot. The second motor is placed at the back of the body and sends power to the two rear wheels to move the robot forward.

  14. 4.3. Human Gesture Analysis • The robot is intended to respond to human gestures. The gestures the robot is meant to respond to are as follows: • Robot turns right when the right arm is held perpendicular to one's body • Robot turns left when the left arm is held perpendicular to one's body • Robot goes straight when both arms are down at one's sides • Robot varies its forward speed based on one's distance from the Kinect sensor. The system determines arm and spine locations using MatLab code paired with the Kinect sensor.

  15. 4.3. Human Gesture Analysis

  16. 4.4. Human-Robot Interface: Model 1 The distance of the spine (joints 1-3) from the Kinect sensor is used to control the speed of the Lego NXT: the higher the motor input value, the faster the robot moves. To turn, the treads need to move at different speeds, so the motor input for the tread on the side of the turn is reduced by a fixed value of ten, regardless of the current speed.
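A minimal sketch of this mapping from spine distance to tread powers. The fixed reduction of ten and the use of the spine distance come from the description above; the linear scaling constants are assumptions made for illustration only:

    % Map the spine's distance from the Kinect (in meters) to a base motor power,
    % then slow the tread on the turning side by a fixed value of ten.
    function [leftPower, rightPower] = treadPowers(spineDistance, turn)
        % Assumed linear scaling: standing farther from the sensor drives the robot faster.
        basePower = min(100, max(0, round(40 * (spineDistance - 0.8))));
        leftPower  = basePower;
        rightPower = basePower;
        if strcmp(turn, 'left')
            leftPower = leftPower - 10;     % slow the left tread to turn left
        elseif strcmp(turn, 'right')
            rightPower = rightPower - 10;   % slow the right tread to turn right
        end
    end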

  17. 4.4. Human-Robot Interface: Model 1 To register whether the robot is to turn, the distance between the shoulder and hand joint is analyzed. If the distance between the left hand (joint 8) and the left shoulder (joint 5) is greater than 0.4 meters, then the Lego NXT will move left. Similarly, if the distance between the right hand (joint 12) and the right shoulder (joint 9) is greater than 0.4 meters, the Lego NXT will move right. These joint distances were chosen for an adult and if the program is to be used with children, the distance value would need to be decreased in order to accommodate a child’s smaller frame.
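A minimal sketch of this turn check, using the joint indices and the 0.4 m threshold described above. The joints matrix is assumed to be the 20x3 array of world coordinates from the acquisition sketch earlier; the optional threshold argument allows the smaller value suggested for children:

    % Decide the turn direction from the shoulder-to-hand distances (rows are joint indices).
    function turn = detectTurn(joints, threshold)
        if nargin < 2
            threshold = 0.4;   % meters; decrease this to accommodate a child's smaller frame
        end
        leftDist  = norm(joints(8, :)  - joints(5, :));   % left hand vs. left shoulder
        rightDist = norm(joints(12, :) - joints(9, :));   % right hand vs. right shoulder
        if leftDist > threshold
            turn = 'left';
        elseif rightDist > threshold
            turn = 'right';
        else
            turn = 'straight';
        end
    end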

  18. 4.4. Human-Robot Interface: Model 1

  19. 4.5. Human-Robot Interface: Model 2 The distance of the spine (joints 1-3) from the Kinect sensor is used to control the speed of the Lego NXT, just as it was for the first model. The second Lego NXT robot is more complicated than the first because the change in design no longer allows steering to be controlled by driving one side faster than the other. To control the steering, the second motor was programmed to rotate clockwise to turn right and counterclockwise to turn left.

  20. 4.5. Human-Robot Interface: Model 2 To register whether the robot is to turn, the distance between the shoulder and hand joint is used as it was with model 1. If the distance between the left hand (joint 8) and the left shoulder (joint 5) is greater than 0.4 m, then the Lego NXT will move left. Similarly, if the distance between the right hand (joint 12) and the right shoulder (joint 9) is greater than 0.4 m, the Lego NXT will move right.
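A minimal sketch of how the same turn decision could drive the front steering motor. Only the directions (clockwise for right, counterclockwise for left) come from the description above; the power magnitude and sign convention are illustrative assumptions, and the command itself would be sent to the NXT over Bluetooth by the Simulink model:

    % Translate a turn decision into a signed power for the front steering motor.
    % Positive = clockwise rotation (turn right), negative = counterclockwise (turn left).
    function steerPower = steeringCommand(turn)
        steerMagnitude = 30;                  % assumed steering motor power
        if strcmp(turn, 'right')
            steerPower = steerMagnitude;      % clockwise: steer the robot to the right
        elseif strcmp(turn, 'left')
            steerPower = -steerMagnitude;     % counterclockwise: steer the robot to the left
        else
            steerPower = 0;                   % keep the front wheels straight
        end
    end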

  21. 4.5. Human-Robot Interface: Model 2

  22. 5.0. Kinect Accuracy While studying gesture-based interfaces using a Kinect sensor and MatLab, random (white) noise was continually observed within the system, resulting in inaccuracies in the skeletal image. This noise is not unexpected and is a common issue when using skeletal tracking systems in practice [18]. In order to get more accurate and precise joint data, it is important to implement a noise reduction filter, also known as a smoothing filter, which produces smoother data by removing the unwanted noise. For this study, while acknowledging the random noise and the resulting inaccuracy, it was decided that creating a filter to remove the random noise would be too involved a project, and therefore no filter was implemented in the models.

  23. 5.0. Kinect Accuracy • Different parameters result in different characteristics and levels of white noise. Examples of such parameters include [17][18]: • IR light being scattered when hitting other objects • Shadows of objects that are close to the Kinect sensor • Room lighting • The user's body size, pose, and distance from the Kinect • Location of the Kinect sensor array • Quantization noise • Rounding effects introduced by computations

  24. 5.0. Kinect Accuracy • There are several different smoothing filters that are useful for projects such as this one [18]. Two filters that are especially useful are the median and jitter removal filters. Examples of smoothing filters include: • Auto regressive moving average (ARMA) filters • Simple averaging filter • Double moving averaging filter • Savitzky-Golay smoothing filter • Exponential smoothing filter • Double exponential smoothing filter • Adaptive double exponential smoothing filter • Taylor series filter • Median filter • Jitter removal filter
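As an illustration of the simplest of these, a minimal sketch of an exponential smoothing filter applied to successive frames of joint coordinates. The smoothing factor and the framewise application are assumptions for illustration; as noted earlier, the project itself did not implement a filter:

    % Exponentially smooth successive 20x3 joint-coordinate frames.
    % alpha near 1 trusts the newest measurement; alpha near 0 smooths more heavily.
    function smoothed = smoothJoints(newJoints, previousSmoothed, alpha)
        if isempty(previousSmoothed)
            smoothed = newJoints;   % first frame: nothing to blend with yet
        else
            smoothed = alpha * newJoints + (1 - alpha) * previousSmoothed;
        end
    end

Calling this once per acquired frame (for example with alpha = 0.5) would damp the random jitter in the skeletal data at the cost of a small lag in the robot's response.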

  25. 6.0. Conclusion Throughout this project, the possibility of controlling the movements of a Lego Mindstorms NXT robot using human gestures was analyzed, and uses for similar technology were considered. In order to capture the user's gestures, a Kinect sensor was used, and the gestures were then analyzed with code written in MatLab and Simulink. After processing the gestures, MatLab would send an output to the Lego Mindstorms NXT model via Bluetooth and act as the controller. Two vehicle-shaped models were built for this project, the first with treads and the second with four wheels. The second model replaced the first because the first's treads slipped and did not grip the floor well enough, so it did not turn as well as desired; the second model gripped the floor better and turned more effectively.

  26. 7.0. References
     [1] http://electronics.howstuffworks.com/microsoft-kinect.htm
     [2] http://www.bizjournals.com/seattle/blog/techflash/2011/08/boeing-taps-microsoft-kinect-to-sell-737.html?page=all
     [3] http://www.pcgamer.com/2012/01/13/kinect-on-pc-more-expensive-less-useful-still-exciting/
     [4] http://research.microsoft.com/en-us/news/features/touchlesssurgery-060712.aspx
     [5] http://www.mathworks.com/products/simulink/
     [6] http://en.wikipedia.org/wiki/Simulink
     [7] https://ewh.ieee.org/rl/ct/sps/PDF/MATLAB/chapter8.pdf
     [8] http://www.educationaltoyskidslove.com/review-history-lego-mindstorms-nxt.html
     [9] http://www.lego.com/en-us/mindstorms/gettingstarted/historypage/
     [10] http://en.wikipedia.org/wiki/Lego_Mindstorms
     [11] http://www.robotshop.com/blog/en/lego-robotic-projects-3701
     [12] http://www.pcmag.com/article2/0,2817,2423200,00.asp
     [13] http://www.zath.co.uk/lego-mindstorms-nxt-2-0-build-and-program-your-own-lego-robot/
     [14] http://shop.lego.com/en-US/LEGO-MINDSTORMS-EV3-31313
     [15] http://en.wikipedia.org/wiki/MATLAB
     [16] http://cimss.ssec.wisc.edu/wxwise/class/aos340/spr00/whatismatlab.htm
     [17] http://www.codeproject.com/Articles/317974/KinectDepthSmoothing
     [18] http://msdn.microsoft.com/en-us/library/jj131429.aspx
     [19] http://aegisacademy.com/community/internal-ballistics-part-ii-mechanical-precision/
