
Autonomous Vehicle Sensors

Abstract - This paper focuses on autonomous vehicle sensors. Sensors such as cameras, radar, and lidar are the main components in the design of autonomous vehicles. These sensors are essential because they allow the system to measure the surrounding environment and to verify that what the car detects is accurate, so that it can be analyzed further for planning, control, and decision making. This article discusses three critical sensors that are a crucial part of the overall autonomous system: radar, lidar, and camera vision.

Mukhtar Oudeif, Saad Alshahrani and Hassan Abdallah
University of Michigan – Dearborn

INTRODUCTION

Autonomous driving has six different levels of autonomy, according to the SAE. As indicated in Figure 1, it ranges from cars with zero automation to those that are completely autonomous. Autonomous vehicles use a collection of software and sensors to visualize surrounding objects and maneuver without driver involvement in order to attain higher levels of autonomy. Cutting-edge sensor technologies are currently advancing at a quick pace to make transportation safer for riders and pedestrians. This has prompted more academics and engineers from a wide range of subjects and backgrounds to participate in the process and handle all of the related difficulties [4].

Figure 1 SAE automation levels [4]

As indicated in Fig. 2, the autonomous vehicle system is classified into four primary groups. Various sensors mounted on the vehicle are used to sense the world; these are the hardware components that collect environmental data. The sensor data is processed in the perception block, which consists of components that integrate sensor input into meaningful information. The output from the perception part is then used by the planning subsystem for behavior planning as well as short- and long-range path planning. The control module follows the path laid out for the AV by the planning subsystem and transmits directives to it [5].

Figure 2 Block diagram of the autonomous vehicle system [5]

RELATED WORK

A. Radar

The authors showed how a polarimetric radar sensor can provide detailed and useful information to identify object dimensions, object orientation, and motion prediction. As shown in the article, the polarimetric radar data, the main velocity analysis, the micro-Doppler analysis, and the street-condition measurement are the focus of the radar function. The author points out two advantages of using circular polarimetry. The first is the high probability of recognizing the target. The second is the ability to differentiate between the various numbers of ray bounces within each scattering center of the target, given by the co-polar and cross-polar polarimetric channels. This helps to distinguish between the vehicle, complex targets, and the environment. The authors were able to plot these probabilities in a driving situation as well as in a parking maneuver; Figure 3 shows the parking maneuver.
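The four-stage flow described in the introduction (sensing, perception, planning, control) can be summarized as a simple data pipeline. The following is a minimal illustrative sketch, not the system described in the cited work; every function, value, and threshold here is a hypothetical placeholder.

```python
# Minimal sketch of the four-stage AV pipeline of Fig. 2:
# sensing -> perception -> planning -> control.
# All names and numbers are hypothetical illustrations, not a real AV stack.

def sense():
    """Sensing: hardware components return raw environmental measurements."""
    return {"radar": [(12.0, 0.3)],   # (range m, bearing rad), made-up values
            "lidar": [(11.8, 0.31)],
            "camera": ["vehicle"]}

def perceive(raw):
    """Perception: integrate raw sensor input into meaningful objects."""
    # Naively pair the radar detection with the camera label.
    rng, bearing = raw["radar"][0]
    return [{"label": raw["camera"][0], "range_m": rng, "bearing_rad": bearing}]

def plan(objects):
    """Planning: choose a behavior from the perceived scene."""
    ahead = [o for o in objects if o["range_m"] < 20.0]
    return "slow_down" if ahead else "keep_lane"

def control(behavior):
    """Control: turn the planned behavior into actuator directives."""
    if behavior == "slow_down":
        return {"throttle": 0.0, "brake": 0.4}
    return {"throttle": 0.3, "brake": 0.0}

directives = control(plan(perceive(sense())))
print(directives)  # a braking directive, since an object is 12 m ahead
```

Each stage consumes only the previous stage's output, which mirrors the block diagram's one-way data flow from sensors to actuators.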

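The velocity analysis named above as one focus of the radar function - selecting, for each range-angle cell of the radar image, the Doppler bin with the strongest echo amplitude - can be sketched with NumPy. The cube dimensions and the velocity axis below are assumptions chosen for illustration, not values from the cited experiment.

```python
import numpy as np

# Sketch of a radar velocity map: for each range-angle pixel, take the
# Doppler bin with the strongest amplitude as that pixel's radial speed.
# The cube shape and velocity scale are illustrative assumptions.

rng_bins, ang_bins, dop_bins = 64, 32, 16
velocity_axis = np.linspace(-15.0, 15.0, dop_bins)  # m/s mapped to Doppler bins

gen = np.random.default_rng(0)
cube = gen.random((rng_bins, ang_bins, dop_bins))  # |amplitude| per (r, a, d)

strongest = cube.argmax(axis=2)          # index of the peak Doppler bin
velocity_map = velocity_axis[strongest]  # (range, angle) -> radial speed, m/s

print(velocity_map.shape)  # (64, 32)
```

The resulting map assigns one radial speed per image pixel, which is how different parts of the same vehicle can show different speeds relative to the sensor.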
Figure 3 Parking maneuver

Also, assigning speeds to every individual backscattering point gave better data, owing to the enormous number of echoes received from the complete surface of the vehicle. Figure 4 shows the corresponding polarimetric radar data, and Figure 5 shows the data for the velocity of the objects.

Figure 4 Image of data for corresponding polarimetric radar

Figure 5 Image of data for velocity of objects

In this experiment, the vehicle exhibited various speeds. The front portions of the vehicle, which move away from the sensor, showed a lower speed of about 5 km/h. The back portions of the vehicle moved straight towards the sensor at a speed of 12 km/h. The authors used the micro-Doppler method for the analysis. The speed of the objects depended on the strongest signal's amplitude within each single pixel that represents a range and angle point in the radar images.

In addition, the authors utilized polarimetric technology to measure road conditions by mounting a polarimetric radar on the front of the vehicle. The article compared the properties of asphalt and gravel road data, which proved this to be a reliable method for identifying road conditions. Figure 6 shows a polarimetric image of an asphalt road.

Figure 6 Image of polarimetric of asphalt road

In this article, the authors used the AVL driving cube to perform the experiments. This testing required a vehicle in the loop, simulating the test scenario with a dynamic model and stimulating torque at the wheel shafts as well as the steering. Figure 7 shows the driving cube for ADAS and ADF tests.

Figure 7 Image of the driving cube for ADAS and ADF tests

A feedback loop fed the control outputs of the vehicle back into the simulation. The simulated radar targets spanned

ranges between 2 and 300 m, to cover more than one realistic target. For the long-range targets, the authors used Radar Target Simulators (RTS) based on Digital RF Memory (DRFM). Additionally, the authors designed a highly scalable analogue radar target simulator to cover the close-range targets. The RTS simulated the target and then transmitted it to the radar sensor according to the parameters of the simulated target. Figure 8 shows the analogue RTS.

Figure 8 Analogue RTS

The modules used in this experiment are fitted with microcontrollers that control signals over the CAN bus with the TEM to enable the exchange of messages. Each module was equipped with an EEPROM to store the calibration data. The RF front end was able to capture the radar signal, convert it to the frequency of the RTS, and ensure adaptation to various frequency bands.

The RTS experiment results showed that it is capable of simulating two different moving targets, with a range of 5-30 m and an RCS dynamic range of 50 dB, which are detected by the radar sensor. Figure 9 shows the radar display with two independent simulated targets.

Figure 9 Radar display with two independent simulated targets

B. Lidar

LiDAR stands for Light Detection and Ranging, a technique developed during the 1970s for use on space and aerial platforms. LiDAR systems work similarly to radars in that they measure the amount of time that a light pulse, in the infrared region, takes after being released by a laser diode until it is collected by the receiver in the lidar [4].

Figure 10 LiDAR system block diagram [4]

Lidar measures the space between the object and the sensor using an infrared laser beam. The majority of contemporary lidars use light of 900 nm wavelength; however, some employ longer wavelengths, which perform better in natural environments such as rain and fog. In today's lidars, a spinning swivel scans the laser beam throughout the field of vision. The laser sends pulses, and objects reflect those pulses; the reflections provide a point cloud that is used to represent the objects. The more laser beams, the larger the number of scan layers, and per layer, lidar has a far higher spatial resolution than radar [5]. When it comes to detecting speed, however, LIDAR systems are often less precise than RADAR systems; this is owing to their inability to use the Doppler effect. As a result, most autonomous vehicles employ both LIDAR and RADAR sensors, as shown on Google's car in Fig. 11 [6].

Figure 11 Google's Autonomous Vehicle [6]

C. Camera

To provide a digital image of the surrounding area for the autonomous vehicle, the camera uses a passive light sensor that can detect any

object in the road, whether it is moving or not. By equipping the vehicle with camera sensors at every angle, it is therefore able to obtain a 360° view of the exterior environment, giving the vehicle a more comprehensive image of traffic situations. Colors and textures are the main benefits of camera sensors today compared to any other type of sensor. This is an important benefit in terms of improving the autonomous vehicle system, because the technology allows the car to recognize road signs, traffic lights, and other objects. See Figure 12.

Figure 12 Camera for Autonomous Car

The cameras can provide better data and digital images for image processing, and different camera sensors have varying capabilities. HD cameras, however, require more power to achieve this, resulting in larger files. According to the author, this method would provide better visual data for autonomous vehicles, but there is a drawback in terms of the amount of power required to achieve such outcomes.

Cameras can also calculate the distance to a specific object, although this necessitates the use of complicated processing techniques, as seen in Figure 13.

Figure 13 Camera for Autonomous Car

In addition to these benefits, the low cost and high availability of this technology are a huge advantage for autonomous vehicles, and it is still less expensive than other sensors such as lidar systems. The main disadvantage of the camera sensor is that it is very sensitive to light intensity: in poor weather, when it rains or snows, its ability to see objects in the roadway can be affected. Furthermore, there are times when the images are not clear enough for a computer to decide what the vehicle should do. The driving algorithms may fail in instances when the colors of objects match the background.

CONCLUSION

Autonomous vehicle development is a major focus of the automotive industry. With the use of sensors such as radar, lidar, and cameras, autonomous vehicles keep improving as they go through testing. Improving these components will allow the automotive business to achieve its goal and provide a product that is reliable and safe. To make the best decisions in real time, the most important requirement of the system is to gather as much information as possible about the area surrounding the autonomous vehicle, including all the objects present on the road. Moreover, the major challenge is to create a highly accurate system that can minimize potential errors and retrieve all the data from the sensors.

REFERENCES

[1] Stefan Trummer, Gerhard F. Hamberger, Richard Koerber, Uwe Siart, and Thomas F. Eibert. "Autonomous Driving Features based on 79 GHz Polarimetric Radar Data." IEEE (2018).

[2] Helmut Schreiber, Michael Gadringer, Andreas Gruber, Wolfgang Bosch, Dominik Amschl, Horst Pflugl, and Steffen Metzner. "Highly Scalable Radar Target Simulator for Autonomous Driving Test Beds." IEEE (2017).

[3] S. Campbell et al., "Sensor Technology in Autonomous Vehicles: A review," 2018 29th Irish Signals and Systems Conference (ISSC), 2018, pp. 1-4, doi: 10.1109/ISSC.2018.8585340.

[4] Vargas, J., Alsweiss, S., Toker, O., Razdan, R., & Santos, J. (2021). An overview of autonomous vehicles sensors and their vulnerability to weather conditions. Sensors, 21(16), 5397.

[5] Kocić, J., Jovičić, N., & Drndarević, V. (2018, November). Sensors and sensor fusion in autonomous vehicles. In 2018 26th Telecommunications Forum (TELFOR) (pp. 420-425). IEEE.

[6] Varghese, J. Z., & Boone, R. G. (2015, September). Overview of autonomous vehicle sensors and systems. In International Conference on Operations Excellence and Service Engineering (pp. 178-191).

[7] Li, Y., & Ibanez-Guzman, J. (2020). Lidar for autonomous driving: The principles, challenges, and trends for automotive lidar and perception systems. IEEE Signal Processing Magazine, 37(4), 50-61.
