Real time H.264 decoder implementation in robot control

  1. Real time H.264 decoder implementation in robot control Saurabh Ghorpade

  2. H.264 decoder [1] • algorithms for evaluating the inverse transforms and analyzing the incoming packets • a high-speed, real-time implementation in C/C++ for robot control • feature recognition still to be implemented

  3. Existing functionality • Java implementation: successful in terms of functionality, but too slow. • Reasons [2]: • Class loading at startup • Unnecessary array validations • Excessive use of the heap • Garbage collection

  4. C for real time control • No Java Virtual Machine, so the program interacts with the hardware directly • Better memory management using pointers • Compiler optimizations and macros can be used • Simple, fast code for the control system, navigation and sockets

  5. Environment setup • Robot communication protocol [3], router • Operating system: Linux • C compiler (GCC [9])

  6. Implementation • Multithreaded socket programming, with one thread per task (a minimal sketch follows): • to send commands to the robot • to decode the video packets
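A minimal sketch of this threading layout using POSIX threads [16]. The names command_loop(), video_loop() and start_threads() are illustrative assumptions, not the project's actual code; each stub stands in for a loop over the corresponding UDP socket.

#include <pthread.h>
#include <stdio.h>

/* Stub thread bodies; the real ones would loop over the command
   and video sockets respectively. */
static void *command_loop(void *arg) { (void)arg; puts("command thread"); return NULL; }
static void *video_loop(void *arg)   { (void)arg; puts("video thread");   return NULL; }

int start_threads(void)
{
    pthread_t cmd_thread, video_thread;

    if (pthread_create(&cmd_thread, NULL, command_loop, NULL) != 0)
        return -1;
    if (pthread_create(&video_thread, NULL, video_loop, NULL) != 0)
        return -1;

    /* Wait for both threads to finish before exiting. */
    pthread_join(cmd_thread, NULL);
    pthread_join(video_thread, NULL);
    return 0;
}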

  7. Image decoding • The image is divided into groups of blocks (GOBs), and each GOB is further divided into macroblocks • Each macroblock covers a 16x16-pixel region in YCbCr format [5] with 4:2:0 chroma subsampling (illustrated below)
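To make the 4:2:0 layout concrete, here is an illustrative struct (an assumption for exposition, not the decoder's actual data type): a 16x16 macroblock carries 256 luma samples, but only 8x8 = 64 samples per chroma plane, because chroma is subsampled by 2 in both directions.

#include <stdint.h>

typedef struct {
    uint8_t y[16][16];   /* luma (Y), full resolution          */
    uint8_t cb[8][8];    /* blue-difference chroma, subsampled */
    uint8_t cr[8][8];    /* red-difference chroma, subsampled  */
} macroblock_420;        /* 256 + 64 + 64 = 384 samples total  */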

  8. Steps • Entropy decoding. • Inverse zigzag, followed by inverse quantization and inverse transformation (a sketch of these steps follows). • Forming the picture according to the picture format given in Fig 6. • Extracting the motion vector information from the packet, followed by motion compensation. • Finally, producing the decoded video.
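A hedged sketch of the middle steps for a single 4x4 block: inverse zigzag reordering, a simplified inverse quantization, and the H.264 core 4x4 inverse integer transform (see Richardson [20]). The flat qscale factor is a simplification; the real dequantization tables depend on QP and coefficient position.

static const int zigzag4x4[16] = {
    0, 1, 4, 8, 5, 2, 3, 6,
    9, 12, 13, 10, 7, 11, 14, 15
};

/* Map each coefficient from scan order back to raster order,
   rescaling it on the way. */
void inv_zigzag_dequant(const int coeff[16], int blk[16], int qscale)
{
    for (int i = 0; i < 16; i++)
        blk[zigzag4x4[i]] = coeff[i] * qscale;
}

/* One 1-D pass of the inverse transform (shift-and-add only). */
static void itrans_1d(int *a, int *b, int *c, int *d)
{
    int e0 = *a + *c;           /* even part */
    int e1 = *a - *c;
    int e2 = (*b >> 1) - *d;    /* odd part  */
    int e3 = *b + (*d >> 1);
    *a = e0 + e3;  *b = e1 + e2;
    *c = e1 - e2;  *d = e0 - e3;
}

/* Rows, then columns, then the final rounding shift. */
void inv_transform4x4(int blk[16])
{
    for (int i = 0; i < 4; i++)
        itrans_1d(&blk[4*i], &blk[4*i+1], &blk[4*i+2], &blk[4*i+3]);
    for (int j = 0; j < 4; j++)
        itrans_1d(&blk[j], &blk[j+4], &blk[j+8], &blk[j+12]);
    for (int i = 0; i < 16; i++)
        blk[i] = (blk[i] + 32) >> 6;
}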

  9. (Fig 1) Client server [3]

  10. (Fig 2) packet [3]

  11. (Fig 3) Modified JPEG [3]

  12. H.264 Encoder (a) and decoder (b) [15]

  13. (Fig 4) Motion vector estimation [3]

  14. (Fig 5) 16x16 intra prediction modes [3]

  15. (Fig 6) Picture format [3]

  16. (Fig 7) Hardware [3]

  17. Hardware description [3] • A.R. (augmented reality) Drone: a quadrotor. • Battery: the AR.Drone flies on a 1 Ah, 11.1 V LiPo battery. • Motion sensors: an ultrasound telemeter and a camera aiming downwards. • Video streaming: the frontal camera is a CMOS sensor with a 90-degree lens.

  18. Algorithm • Initialize the UDP sockets for streaming video, sending commands and accepting navigation data (a socket-setup sketch follows). • Create threads for streaming video, sending commands, monitoring the keyboard and accepting the navigation data from the AR.Drone. • Enter an infinite loop. • Poll for keyboard events.
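A sketch of the socket setup, assuming one UDP socket per stream. Port 5555 for video matches slide 25; the navdata (5554) and AT command (5556) port numbers are assumptions based on the AR.Drone developer's guide [3], and make_udp_socket() is a hypothetical helper.

#include <stdint.h>
#include <sys/socket.h>
#include <netinet/in.h>

static int make_udp_socket(uint16_t port)
{
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(port);
    bind(s, (struct sockaddr *)&addr, sizeof(addr));
    return s;
}

int navdata_sock, video_sock, at_sock;

void init_sockets(void)
{
    navdata_sock = make_udp_socket(5554);  /* navigation data    */
    video_sock   = make_udp_socket(5555);  /* video stream       */
    at_sock      = make_udp_socket(5556);  /* AT command channel */
}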

  19. Algorithm • Update the image on the screen. • If the start key is pressed, set the robot into hover mode by sending it the hover command. • The robot then waits for the red ball to show up in front of its frontal camera. • If the ball is not visible, the robot holds its position until the stop key is pressed. • If the ball appears in the frontal camera's view, the robot uses open source computer vision (OpenCV [18]) to recognize it. At the same time, the data arriving at the laptop through the video socket is analyzed: the inverse transform is calculated, followed by motion estimation and compensation. The decoded video is shown on the screen.

  20. Algorithm cont.. • The centroid of the ball is then calculated (a sketch follows). • As the robot moves, the difference between the ball's updated position and its previous position is calculated (movement is restricted to the horizontal direction). • If the difference is positive, it is inferred that the ball has moved to the right, so a command is sent to move the robot to the right with the intention of reducing the difference. • If the difference is negative, it is inferred that the ball has moved to the left.
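A hedged sketch of the centroid computation using OpenCV's legacy C API [18]; the deck does not show the author's actual calls. Here mask is assumed to be a binary image in which the red ball has already been segmented, and the horizontal centroid is the ratio of spatial moments m10/m00.

#include <opencv/cv.h>

/* Return the x coordinate of the ball's centroid, or -1 if the
   ball is not in view (empty mask). */
int ball_centroid_x(IplImage *mask)
{
    CvMoments m;
    cvMoments(mask, &m, 1);                    /* 1 = binary image */
    double m00 = cvGetSpatialMoment(&m, 0, 0); /* area             */
    if (m00 == 0.0)
        return -1;
    return (int)(cvGetSpatialMoment(&m, 1, 0) / m00);
}

The sign of the difference between two successive ball_centroid_x() values then decides whether the robot is commanded to the right or to the left.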

  21. Algorithm cont.. • In that case, the command is sent to move the robot to the left side. • Finally, once the stop key is pressed, the robot is brought down by sending the land command (sketched below).
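A sketch of the landing step under the AT*REF protocol described in the developer's guide [3]: the second argument is a 32-bit flag field, and clearing the takeoff bit (bit 9) commands a landing. The REF_* constants and the reuse of nb_sequence and at_write() from slide 22 are assumptions for illustration.

#include <stdio.h>
#include <string.h>
#include <stdint.h>

/* From the slide-22 program: */
extern int nb_sequence;
extern void at_write(int8_t *buffer, int32_t len);

#define REF_BASE    (1u<<18 | 1u<<20 | 1u<<22 | 1u<<24 | 1u<<28)
#define REF_LAND    (REF_BASE)           /* takeoff bit cleared */
#define REF_TAKEOFF (REF_BASE | 1u<<9)   /* takeoff bit set     */

void land(void)
{
    char cmd[64];
    snprintf(cmd, sizeof cmd, "AT*REF=%d,%u\r", nb_sequence++, REF_LAND);
    at_write((int8_t *)cmd, strlen(cmd));
}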

  22. A simple program to establish a connection with the A.R. Drone:

static void send_command()
{
    /* Build the AT*PCMD command string. The floating-point pitch,
       roll, gaz and yaw fields are transmitted as the integer bit
       patterns of their 32-bit float representations. */
    snprintf(str, AT_BUFFER_SIZE, "AT*PCMD=%d,%d,%d,%d,%d,%d\r",
             nb_sequence++, 1,
             *(int*)&radiogp_cmd.pitch, *(int*)&radiogp_cmd.roll,
             *(int*)&radiogp_cmd.gaz, *(int*)&radiogp_cmd.yaw);
    at_write((int8_t*)str, strlen(str));
}

void at_write(int8_t *buffer, int32_t len)
{
    struct sockaddr_in to;

    /* Create the UDP command socket on first use. */
    if (at_udp_socket < 0)
        at_udp_socket = socket(AF_INET, SOCK_DGRAM, 0);

    if (at_udp_socket >= 0) {
        memset((char*)&to, 0, sizeof(to));
        to.sin_family = AF_INET;
        to.sin_addr.s_addr = inet_addr(WIFI_MYKONOS_IP);
        to.sin_port = htons(AT_PORT);
        sendto(at_udp_socket, (char*)buffer, len, 0,
               (struct sockaddr*)&to, sizeof(to));
        /* Replace the trailing '\r' with '\n' so the command can be
           printed for debugging. */
        buffer[strlen((char*)buffer)-1] = '\n';
    }
}

  23. Program explanation • send_command() is the first function called from main(). • It fills the buffer with the command to be sent to the robot. • at_write() then sends the command to the robot over WiFi [8].

  24. Program explanation • A datagram socket allows the data to be sent using the UDP protocol. • at_write() initializes the socket with the robot's IP address and port number. • Finally, sendto() is called; it takes the socket and the buffer containing the command and sends it to the robot over WiFi [8]. • sendto() is provided by the Linux networking libraries.

  25. Further work • The video stream is available on port 5555, so current work is focused on analyzing the incoming bit stream. • Before that, the application needs to be multithreaded in order to handle streaming, navigation, control and keyboard monitoring simultaneously. • Keyboard monitoring is handled by the Simple DirectMedia Layer (SDL [17]) libraries (a polling sketch follows). • Finally, object recognition is done using the OpenCV [18] library.
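A minimal sketch of the SDL [17] keyboard polling (SDL 1.2 API). The choice of SDLK_s and SDLK_q as the start and stop keys is an assumption; the deck does not name the actual bindings.

#include <SDL/SDL.h>

/* Drain the SDL event queue and record start/stop key presses. */
void poll_keyboard(int *start, int *stop)
{
    SDL_Event ev;
    while (SDL_PollEvent(&ev)) {
        if (ev.type == SDL_KEYDOWN) {
            if (ev.key.keysym.sym == SDLK_s) *start = 1;
            if (ev.key.keysym.sym == SDLK_q) *stop = 1;
        }
    }
}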

  26. Applications [3]: • Reality games: the A.R. Drone can be controlled by a joystick or a smartphone to play video games. • Advertising: the A.R. Drone can be used for online advertising. • Medical and 3D vision: photos of patients from various angles can be captured by the frontal as well as the bottom camera.

  27. References • [1] F. Pescador, M. J. Garrido, C. Sanz, E. Juarez, M. C. Rodriguez and D. Samper, “A real-time H.264 MP decoder based on a DM642 DSP”, 14th IEEE International Conference on Electronics, Circuits and Systems (ICECS 2007), Madrid, Spain, Vol. 11, pp. 1248-1251, Dec. 2007. • [2] Why Java is slower than C: http://www.jelovic.com/articles/why_Java_is_slow.htm • [3] A.R. Drone Developer’s Guide: https://projects.ardrone.org/login?back_url=http%253A%252F%252Fprojects.ardrone.org%252Fattachments%252Fdownload%252F365%252FARDrone_SDK_1_7_Developer_Guide.pdf • [4] A.R. Drone: http://ardrone.parrot.com/parrot-ar-drone/usa/ • [5] YCbCr format: http://en.wikipedia.org/wiki/YCbCr • [6] H.264 reference: http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC • [7] W. T. Staehler and A. A. Susin (UFRGS, Porto Alegre), “Real-time 4x4 intraframe prediction architecture for a H.264 decoder”, International Telecommunications Symposium 2006, pp. 416-421, Sept. 2006.

  28. References • [8] WiFi wiki: http://en.wikipedia.org/wiki/Wi-Fi • [9] GCC wiki: http://en.wikipedia.org/wiki/GNU_Compiler_Collection • [10] Introduction to make (Linux make for building C/C++ sources): http://linuxdevcenter.com/pub/a/linux/2002/01/31/make_intro.html • [11] JPEG wiki: http://en.wikipedia.org/wiki/JPEG • [12] Shih-Tse Wei, Chia-Wei Tien, Bin-Da Liu and Jar-Ferr Yang, “Adaptive truncation algorithm for Hadamard-transformed H.264/AVC lossless video coding”, IEEE Transactions on Circuits and Systems for Video Technology, Vol. 21, pp. 538-549, May 2011. • [13] Run-length coding wiki: http://en.wikipedia.org/wiki/Run-length_encoding • [14] Huffman coding wiki: http://en.wikipedia.org/wiki/Huffman_coding • [15] Soon-kak Kwon, A. Tamhankar and K. R. Rao, “Emerging H.264/AVC video coding standard”, J. Visual Communication and Image Representation, Vol. 17, pp. 186-216, April 2006; H.264 review: http://www-ee.uta.edu/dip/Courses/EE5351/ee5351.htm

  29. References • [16] POSIX thread wiki: http://en.wikipedia.org/wiki/POSIX • [17] SDL wiki: http://en.wikipedia.org/wiki/Simple_DirectMedia_Layer • [18] OpenCV wiki: http://en.wikipedia.org/wiki/OpenCV • [19] DCT wiki: http://en.wikipedia.org/wiki/Discrete_cosine_transform • [20] I. E. Richardson, White Paper: A Technical Introduction to H.264/AVC.

  30. Thank you! • Any questions?
