
Sensing the Visibility Range at Low Cost in the SAFESPOT Road Side Unit



  1. Sensing the Visibility Range at Low Cost in the SAFESPOT Road Side Unit Nicolas Hautière1, Jérémie Bossu1, Erwan Bigorgne1, Nicolas Hiblot2, Abderrahmane Boubezoul1, Benoit Lusetti2, Didier Aubert2 1. LEPSiS, INRETS/LCPC, Univ. Paris-Est, France 2. LIVIC, INRETS/LCPC, France

  2. Overview of the system • The proposed system is a data chain which produces environmental information in the SF Local Dynamic Map based on the detection of meteorological events (rain, fog, black ice, wet road) by one or several sensors of the SAFESPOT Road Side Unit. • It refines these events, or may create a new event, by combining the outputs of the different sensors, in particular CCTV cameras. • By querying the status of vehicle actuators with respect to their past locations, the component is also able to extend or reduce the detection area of this environmental event. • The information is intended to be used in ‘Hazard & Incident Warning’ and ‘Speed Alert’ applications.

  3. The SAFESPOT Road Side Unit [Architecture diagram: roadside sensing systems, legacy systems (via a gateway) and SF vehicles (via the VANET Router, UDP) feed the data fusion chain – Object Refinement, Situation Refinement, Message Generation – which writes to the LDM Server over TCP/IP; applications access the LDM API for traffic events/accidents, weather/road status, vehicle manoeuvres, etc.; responsibilities are split between SP2, SP3, SP4 and SP5 of the SP2 framework.]

  4. The Local Dynamic Map • Real-time map of the vehicle surroundings with static and dynamic safety information [Illustration: ego vehicle, other vehicles, road side unit, landmarks for referencing, map from provider, and dynamic events such as congestion, accident, daytime fog and temporary regional info.]

  5. Road visibility • Based on the French standard NF-P-99-320 • The SF system shall detect visibilities below 400 m • The SF system should assign low visibilities to one of the standard's four categories (see the sketch below) • The system should detect the origin of the visibility reduction: fog, hydrometeors
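
For illustration, a minimal sketch of such a categorization, assuming the four NF-P-99-320 low-visibility classes commonly cited in the literature (class 1: 200-400 m, class 2: 100-200 m, class 3: 50-100 m, class 4: below 50 m); these thresholds are an assumption and should be checked against the standard itself.

def visibility_class(visibility_m: float):
    """Map a measured visibility range (metres) to a low-visibility class.

    Class boundaries are assumed from the NF-P-99-320 standard
    (1: 200-400 m, 2: 100-200 m, 3: 50-100 m, 4: below 50 m);
    returns None when visibility exceeds 400 m (no reduced-visibility event).
    """
    if visibility_m >= 400:
        return None
    if visibility_m >= 200:
        return 1
    if visibility_m >= 100:
        return 2
    if visibility_m >= 50:
        return 3
    return 4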

  6. Data sources: CCTV for Visibility (1/3) – Overview • Technology: the sensing system aims to detect and classify critical weather conditions (dense fog, heavy rain showers) and to estimate the visibility range using classical CCTV cameras • Camera used: DALSA Genie M-1400, resolution 1392 x 1040, 1/2" sensor, 4.65 µm x 4.65 µm pixels, 15 frames/s • Detection software: a background modelling approach, based on a mixture of Gaussians, is used to separate the foreground from the background. Since fog is a steady phenomenon, the background image is used to detect and quantify it; since rain is a dynamic phenomenon, the foreground is used to detect it (a minimal sketch is given below) • Compatible with existing video-surveillance solutions
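
As an illustration of this background/foreground separation, a minimal sketch using OpenCV's built-in mixture-of-Gaussians background subtractor (MOG2); this is not the project's actual detection software, and the input file name and parameter values are assumptions.

import cv2

# Mixture-of-Gaussians background subtractor; history length and variance
# threshold are illustrative values, not the SAFESPOT settings.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16,
                                                detectShadows=False)

capture = cv2.VideoCapture("cctv_stream.avi")  # hypothetical recorded CCTV stream
while True:
    ok, frame = capture.read()
    if not ok:
        break
    foreground_mask = subtractor.apply(frame)      # dynamic phenomena (e.g. rain streaks)
    background = subtractor.getBackgroundImage()   # steady scene, used for fog analysis
capture.release()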

  7. Data sources: CCTV for Visibility (2/3) – Fog detection [Illustration: from the original sequence, fog detection and meteorological visibility estimation yield Vmet; the driving space area yields the mobilized visibility distance Vmob.]

  8. Data sources: CCTV for Visibility (3/3) – Hydrometeors detection [Illustration of the processing pipeline: original sequence → segmentation → classification → detection.]
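
A minimal sketch of the classification step of such a pipeline, assuming that rain-streak candidates are selected among foreground segments by their elongated, near-vertical shape; the bounding-box criterion and the thresholds are assumptions, not the published classification model.

import cv2
import numpy as np

def rain_streak_candidates(foreground_mask, min_elongation=3.0, min_area=10):
    """Keep small, elongated, near-vertical foreground segments as rain-streak candidates.

    Illustrative only: a real classifier would also use the segment
    orientation statistics rather than bounding boxes alone.
    """
    num, labels, stats, _ = cv2.connectedComponentsWithStats(
        foreground_mask.astype(np.uint8), connectivity=8)
    candidates = []
    for label in range(1, num):  # label 0 is the background
        x, y, w, h, area = stats[label]
        if area >= min_area and h >= min_elongation * w:
            candidates.append((x, y, w, h))
    return candidates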

  9. Situation refinement of visibility range • Data fusion at RSU level: fog presence identified by the CCTV camera, confirmation by the weather station, combination of the different sensor outputs to compute a single visibility range descriptor • At road network level: the visibility range is the spatial barycenter of the different sensors' outputs; the corresponding uncertainty is the sum of: the uncertainty of the sensors themselves, the uncertainty coming from the distance to the data sources, and the uncertainty coming from the status of fog lamps on the road section • Possible other data sources: mobile fog sensor, visibility meter, fog lamps status

  10. Situation refinement of visibility range • Results on the LCPC test track [Illustrations: meteorological visibility map and uncertainty map; legend: SAFESPOT camera, in-vehicle camera, fog lights on, fog lights off.]

  11. Conclusion and perspectives • The performances of the detection modules are good, despite a lack of ground truth data; a more systematic evaluation should be carried out. • A general framework to fuse different visibility-range-related data sources has been proposed. • Fusion with low-cost active sensors is planned. • Integration and test of the ‘Hazard & Incident Warning’ (H&IW) and ‘Speed Alert’ (SpA) applications on the CG22 test site.

  12. Annex 1: Data fusion – At the RSU level • At the RSU level, fog presence is determined by the CCTV camera and may be confirmed (or not) by the weather station, using physical constraints due to fog formation • Assuming Gaussian variables, Vmet and Vmob are fused to obtain a single descriptor and determine the visibility range V • A simple linear Kalman filter is then used to compute a weighted iterative least-squares regression (a minimal sketch is given below)
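
A minimal sketch of such a fusion, assuming standard inverse-variance weighting of two Gaussian estimates followed by a scalar Kalman filter with a static state; the variable names and noise values are illustrative, not those of the SAFESPOT implementation.

def fuse_gaussian(v_met, var_met, v_mob, var_mob):
    """Inverse-variance fusion of two Gaussian visibility estimates (Vmet, Vmob)."""
    var = 1.0 / (1.0 / var_met + 1.0 / var_mob)
    v = var * (v_met / var_met + v_mob / var_mob)
    return v, var

def kalman_update(v_est, var_est, measurement, var_meas):
    """One step of a scalar linear Kalman filter with a static visibility state,
    equivalent to a weighted iterative least-squares update."""
    gain = var_est / (var_est + var_meas)
    v_est = v_est + gain * (measurement - v_est)
    var_est = (1.0 - gain) * var_est
    return v_est, var_est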

  13. Annex 1: Data fusion – At the road network level (1/2) • At a point of the road network, the visibility range depends on the surrounding data sources • Each data source has its own uncertainty due to its measurement principle • Since fog is a local phenomenon, the uncertainty also increases strongly with the distance to the data source (an illustrative model is sketched below)
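
For instance, a simple illustrative uncertainty model, assuming a sensor-specific base uncertainty that grows with the distance to the data source; the linear growth and its coefficient are assumptions, not the formula used in the project.

def visibility_uncertainty(sensor_sigma_m, distance_m, growth_per_m=0.5):
    """Total uncertainty of a visibility estimate observed `distance_m` away:
    base sensor uncertainty plus a distance-dependent term (fog is local)."""
    return sensor_sigma_m + growth_per_m * distance_m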

  14. Annex 1: Data fusion – At the road network level (2/2) • A threshold is used to filter out uncertain data • The visibility distance is thus expressed as the spatial barycenter of the different sensors' outputs, together with its corresponding uncertainty (a minimal sketch is given below)
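
A minimal sketch of this last step, assuming an inverse-uncertainty-weighted spatial barycenter over the sensors whose uncertainty passes the threshold; the weighting scheme, the threshold value and the handling of the empty case are assumptions, not the exact SAFESPOT formula.

def visibility_at_point(sensor_estimates, max_sigma_m=200.0):
    """Visibility at a road-network point from the surrounding data sources.

    `sensor_estimates` is a list of (visibility_m, sigma_m) pairs, where
    sigma_m already includes the distance-dependent term.  Sources whose
    uncertainty exceeds the threshold are discarded; the remaining ones are
    combined as an inverse-variance weighted barycenter.  Illustrative only.
    """
    kept = [(v, s) for v, s in sensor_estimates if s <= max_sigma_m]
    if not kept:
        return None, None  # no reliable data source around this point
    weights = [1.0 / (s * s) for _, s in kept]
    total = sum(weights)
    visibility = sum(w * v for w, (v, _) in zip(weights, kept)) / total
    sigma = (1.0 / total) ** 0.5
    return visibility, sigma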

