
Travel Time and Reliability: Is Data Quality a Showstopper? The Georgia Navigator Experience


Presentation Transcript


  1. Travel Time and Reliability: Is Data Quality a Showstopper? The Georgia Navigator Experience Angshuman Guin, URS Corporation, angshuman_guin@urscorp.com ITSA, Phoenix, May 2005

  2. Overview • NaviGAtor Travel Times • Data Failure Issues • Remedial Measures • Transportation Sensor System (TSS) • Maintenance Management Plan (MMP) • Key Stations • VDS Quality Assurance

  3. Georgia NaviGAtor’s Video Detection System (VDS) • 1,361 Fixed Black & White Cameras • Spaced Every 1/3 mile on Freeways • Continuous Speed / Volume/ Occupancy Data • Generates Travel Times for CMS

  4. Navigator ATMS Archive • Data Attributes • Volume / Count • Average Speed (per 15 minutes per lane) • Lane Occupancy (per 15 minutes per lane) • Frequency: 20 second data aggregated to 15 minutes data in archive • Per Lane • Bi-directional • Mainline / Ramps
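As a rough illustration of the archive aggregation described above, the sketch below rolls hypothetical 20-second lane samples up to a 15-minute record. The field names (`volume`, `speed_mph`, `occupancy`) are illustrative assumptions, not the actual NaviGAtor schema.

```python
from statistics import mean

# Hypothetical 20-second samples for one lane within a 15-minute window
# (a full window would hold 45 samples); values are illustrative only.
samples = [
    {"volume": 8, "speed_mph": 58.0, "occupancy": 0.07},
    {"volume": 9, "speed_mph": 55.5, "occupancy": 0.09},
    # ... remaining 20-second records ...
]

def aggregate_15_min(samples):
    """Roll 20-second lane samples up to one 15-minute archive record."""
    return {
        "volume": sum(s["volume"] for s in samples),              # counts add up
        "avg_speed_mph": mean(s["speed_mph"] for s in samples),   # average over window
        "avg_occupancy": mean(s["occupancy"] for s in samples),
    }

print(aggregate_15_min(samples))
```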

  5. Travel Times on Website

  6. Travel Time Determination • Dynamic Message Sign (DMS) • A trip section is comprised of 2 Zones • Each Zone is comprised of 2 Sub-Zones • Each Sub-Zone is comprised of several Stations • (Slide diagram: direction of travel from the DMS through Subzones 1-1 and 1-2 of Zone 1 and Subzones 2-1 and 2-2 of Zone 2 to the destination)
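A minimal sketch of the trip-section hierarchy described on this slide, assuming simple container classes; the class and field names are illustrative, not NaviGAtor's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Station:
    station_id: int
    avg_speed_mph: float          # 1-minute station average speed

@dataclass
class SubZone:
    stations: List[Station] = field(default_factory=list)   # several stations each

@dataclass
class Zone:
    sub_zones: List[SubZone] = field(default_factory=list)  # two sub-zones per zone

@dataclass
class TripSection:
    zones: List[Zone] = field(default_factory=list)         # two zones per DMS trip section

# One DMS trip section: 2 zones, each with 2 sub-zones
section = TripSection(zones=[Zone(sub_zones=[SubZone(), SubZone()]),
                             Zone(sub_zones=[SubZone(), SubZone()])])
```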

  7. Travel Time Determination cont. • Dynamic Message Signs cont. • 20-second Station average speeds are aggregated into 1-minute Station, Sub-Zone and Zone average speeds • 1-minute Zone average speeds are categorized as: Moving Very Well (55+ mph), Moving Well (40 – 55 mph), Moving Slowly (30 – 40 mph), Moving Very Slowly (< 30 mph)
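A minimal sketch of the aggregation and categorization above, assuming a plain function over a 1-minute zone average speed. How boundaries at exactly 40 and 55 mph are handled is an assumption; the slide does not specify it.

```python
def categorize_zone_speed(avg_speed_mph: float) -> str:
    """Map a 1-minute zone average speed to the four condition bands on the slide."""
    if avg_speed_mph >= 55:
        return "Moving Very Well"
    if avg_speed_mph >= 40:
        return "Moving Well"
    if avg_speed_mph >= 30:
        return "Moving Slowly"
    return "Moving Very Slowly"

# 20-second station speeds aggregated to a 1-minute zone average, then categorized
station_speeds_20s = [52.0, 47.5, 44.0]                 # illustrative values
zone_avg = sum(station_speeds_20s) / len(station_speeds_20s)
print(categorize_zone_speed(zone_avg))                  # "Moving Well"
```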

  8. Travel Time Determination cont. • Dynamic Message Signs cont. • 16 Travel Time messages are created in the message library for each DMS and displayed as traffic conditions change, according to a matrix of Zone conditions shown on the slide • Example messages: Travel Time = 5 - 6 min; Travel Time = 10 - 11 min; Travel Time = 16+ min
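With two Zones per trip section and four condition bands per Zone, 4 × 4 = 16 combinations is consistent with the 16-message library described above. The sketch below shows one plausible lookup structure; the keying scheme and all message texts except the three examples on the slide are assumptions.

```python
# Hypothetical message library keyed on (Zone 1 condition, Zone 2 condition).
# Only a few entries are shown; a full library would hold all 16 combinations.
MESSAGE_MATRIX = {
    ("Moving Very Well", "Moving Very Well"):     "Travel Time = 5 - 6 min",
    ("Moving Well", "Moving Slowly"):             "Travel Time = 10 - 11 min",
    ("Moving Very Slowly", "Moving Very Slowly"): "Travel Time = 16+ min",
}

def select_dms_message(zone1_condition: str, zone2_condition: str) -> str:
    """Pick the pre-built DMS message for the current pair of zone conditions."""
    return MESSAGE_MATRIX.get((zone1_condition, zone2_condition),
                              "Travel Time unavailable")

print(select_dms_message("Moving Well", "Moving Slowly"))
```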

  9. Travel Times as a Performance Measure

  10. Travel Time Variability

  11. Data Failure Issues

  12. Existing Archived Data Flow

  13. Remedial Measures • Data Archive • Architecture (Transportation Sensor System – TSS) • Data sample • Data Collection (Maintenance Management Plan) • Hardware • Software

  14. TSS Process (slide diagram covering Server Administration and Communication)

  15. New Navigator Archive Transportation Sensor System (TSS) Attributes • Detector ID (integer) • StartTime (datetime) • Duration (integer, seconds) • Total-volume (integer, 0+) • Percent-trucks (float, 0.00 - 1.00) • Lane Occupancy (float, 0.00 - 1.00) • Average Speed (float, 0.0+) • Std. Dev. Lane Occupancy (float, 0.00 - 1.00) • Std. Dev. Average Speed (float, 0.0+) • VALIDITY (integer, 1 - 100): percentage of valid samples in the 5-minute aggregate • AVAILABILITY (integer, 1 - 100): percentage of available samples in the 5-minute aggregate • XML Format • 5-minute intervals instead of 15-minute • Truck percentages • Filtering of data to eliminate bogus data • Meta-data information
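A sketch of how one 5-minute TSS record might be serialized to XML, using the attribute list from this slide. The element names, nesting, and sample values are assumptions; the slide lists the fields but not the exact schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical 5-minute TSS aggregate; field names follow the slide,
# element names and values are illustrative assumptions.
record = {
    "detector-id": "10423",
    "start-time": "2005-05-02T07:15:00",
    "duration": "300",                  # seconds
    "total-volume": "212",
    "percent-trucks": "0.08",
    "lane-occupancy": "0.14",
    "average-speed": "48.6",
    "stddev-lane-occupancy": "0.03",
    "stddev-average-speed": "4.2",
    "validity": "93",                   # % of valid samples in the aggregate
    "availability": "100",              # % of available samples in the aggregate
}

root = ET.Element("tss-record")
for name, value in record.items():
    ET.SubElement(root, name).text = value

print(ET.tostring(root, encoding="unicode"))
```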

  16. Data Quality Measures: Availability and Validity • Availability = (n_usable / n_total expected) × 100, where: n_usable = the number of data samples with values present in the aggregate, n_total expected = the total number of data samples expected for the aggregate • Validity = (n_valid / n_total expected) × 100, where: n_valid = the number of data samples with values meeting the validity criteria, n_total expected = the total number of data samples expected for the aggregate
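A minimal sketch of the two measures, assuming each 5-minute aggregate expects fifteen 20-second samples and a caller-supplied validity test. The `is_valid` predicate is a stand-in assumption for the validity criteria on the next slide.

```python
def availability(n_usable: int, n_total_expected: int) -> float:
    """Percentage of expected samples that were present in the aggregate."""
    return 100.0 * n_usable / n_total_expected

def validity(n_valid: int, n_total_expected: int) -> float:
    """Percentage of expected samples that met the validity criteria."""
    return 100.0 * n_valid / n_total_expected

# A 5-minute aggregate expects 15 twenty-second samples.
samples = [{"volume": 7, "occupancy": 0.08}] * 13       # only 13 of 15 arrived
is_valid = lambda s: 0.0 <= s["occupancy"] <= 1.0 and s["volume"] >= 0   # stand-in criteria

n_expected = 15
n_usable = len(samples)
n_valid = sum(1 for s in samples if is_valid(s))

print(availability(n_usable, n_expected))   # ~86.7
print(validity(n_valid, n_expected))        # ~86.7
```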

  17. Validity Criteria (slide chart: volume and lane-occupancy traces over time, annotated with validity criteria 1 through 7; the criteria descriptions appear only in the slide graphic)

  18. Maintenance Management Plan • Hardware Failures • Quality Assurance

  19. Maintenance Management Plan • Data calibration and validation with Ground Truth data • 1361 VDS Stations (5000+ detectors) • Key Stations (20+) • Problematic Detector Indicator Methodology

  20. Key Stations & Priority Stations

  21. Problematic Detector Indicator Methodology • E_v = f_K(TOD, DOW) × V_k, where: E_v = expected value for the station volume, V_k = volume of the key station associated with this station, f_K(TOD, DOW) = key-station adjustment factor for time of day (TOD) and day of week (DOW), ε = expected error tolerance • A detector is flagged as problematic when its observed volume deviates from E_v by more than ε
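A hedged sketch of the indicator, assuming the expected station volume is the key-station volume scaled by the TOD/DOW adjustment factor and that a detector is flagged when its observed volume falls outside the ε tolerance. The flag rule and the treatment of ε as a relative tolerance are inferences from the variable definitions, not stated explicitly on the slide.

```python
def expected_station_volume(key_station_volume: float, adjustment_factor: float) -> float:
    """E_v = f_K(TOD, DOW) * V_k -- expected volume derived from the associated key station."""
    return adjustment_factor * key_station_volume

def is_problematic(observed_volume: float, key_station_volume: float,
                   adjustment_factor: float, tolerance: float) -> bool:
    """Flag a detector whose observed volume deviates from E_v by more than the tolerance."""
    e_v = expected_station_volume(key_station_volume, adjustment_factor)
    return abs(observed_volume - e_v) > tolerance * e_v   # tolerance as relative error (assumption)

# Illustrative numbers only
print(is_problematic(observed_volume=420.0, key_station_volume=500.0,
                     adjustment_factor=0.9, tolerance=0.15))   # False
```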

  22. Maintenance Management Plan • Key Stations (20+) • Maintenance not only for failure but also for data quality • Data calibration and validation with Ground Truth data • Formalized procedure • Defined sample requirements • Use Hypothesis Testing (paired-t) • Obtain accuracy statistics
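A brief sketch of the paired-t validation step described above, assuming paired ground-truth and VDS speed observations for the same station and time intervals. The data values and the 0.05 significance level are illustrative assumptions.

```python
from scipy import stats

# Paired observations for the same station and intervals (illustrative values)
ground_truth_speeds = [54.2, 48.7, 61.0, 45.3, 57.8, 50.1]
vds_speeds          = [52.9, 49.5, 59.4, 44.8, 58.6, 48.7]

# Paired t-test: does the VDS systematically differ from ground truth?
t_stat, p_value = stats.ttest_rel(ground_truth_speeds, vds_speeds)

# Accuracy statistic: mean VDS error relative to ground truth
errors = [v - g for v, g in zip(vds_speeds, ground_truth_speeds)]
mean_error = sum(errors) / len(errors)

print(f"t = {t_stat:.3f}, p = {p_value:.3f}, mean error = {mean_error:.2f} mph")
if p_value < 0.05:
    print("Reject H0: systematic bias between VDS and ground truth")
else:
    print("Fail to reject H0: no significant bias detected")
```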

  23. Questions? www.georgia-navigator.com mynav.georgia-navigator.com
