
TMC Data Capture for Performance and Mobility Measures Booz Allen Hamilton HNTB TRAC-UW

Presentation Transcript


  1. TMC Data Capture for Performance and Mobility Measures • Booz Allen Hamilton • HNTB • TRAC-UW

  2. Project Team • Booz Allen Hamilton • HNTB Corporation • University of Washington’s Transportation Center (TRAC-UW) • FreeAhead • TÜV Rheinland

  3. Presentation Outline • Project Background • Scope of Work • Results • Questions and Answers Source: WSDOT Flickr

  4. Problem Statement • Lack of accurate data inhibits ability to monitor and measure TMC performance • As data gathering becomes more automated, potential use of the data increases • Need adequate and accurate means of collecting and archiving performance, mobility and incident management data

  5. Scope of Work • Develop a Guidebook • Concepts, • Methods, • Techniques, and • Procedures for collecting and archiving TMC operation data

  6. Scope of Work • Provide technical guidance and recommended practices for TMC performance • Monitoring, • Evaluating, and • Reporting for transportation system mobility and traffic incident management

  7. Scope of Work • Include case studies as illustrative examples • Identify best practices and lessons learned • at the system level • for specific scenarios or locations

  8. Guidebook Audience and Users • Targeted end users • TMC managers, • Supervisors • Operators • Also anyone with a role in TMS/TMC • performance monitoring, • evaluation, and • reporting

  9. Guidebook Audience and Users • This includes: • State and local DOTs • Metropolitan Planning Organizations • Transit agencies • TMC contractors • Others having a role in the TSM&O • Representatives of EOCs, enforcement agencies and others having a role in emergency transportation operations

  10. Approach to Developing Guidebook • Conduct literature search, • Survey practitioners, and • Use the experience of the research team and project panel • Identified: • Needs and starting points of the potential users • Key issues and challenges that are common among TMCs

  11. Approach to Developing Guidebook • Acknowledges the diversity of TMCs • Activities performed • Data currently collected • Current expectations for reporting performance • Experience with collecting, reporting data • Provide a process to bridge the gap in capabilities between these diverse TMCs

  12. Literature Review • A large volume of literature exists on performance measures for TMC activities and roadway mobility • Available through: • FHWA TIM web site • FHWA Operations web site • NCHRP • NTOC • General web publications (e.g., TAMU Urban Mobility report)

  13. Literature • Performance Measures • Output measures • What activities are being undertaken • Outcome measures • What is happening on the roadway (volume/delay) • Supporting documents • Data processing and quality control procedures, database design, and communications design and requirements • Frequently correspond to very specific databases

  14. Practitioners • TMCs interviewed ranged from • Those very experienced with performance monitoring (the bleeding edge), to • Those just getting started with performance monitoring, and • Several TMCs somewhere in the middle

  15. Practitioner Survey Results • Motivations for performance reporting • Legislative mandates • Agency wide performance initiatives • Formal business linkage – particularly for operations • Use reporting to improve agency cooperation, employee actions, or to judge contractor performance • Plan future investments / resource allocation • Quantification of benefits • To assist in defense of program funding • To assist in competition for additional funding with infrastructure projects

  16. Practitioner Survey Results • Performance reporting generally grows as follows: • What are we doing? • How well are we doing those activities? • What is happening on the roadway? • How is what we are doing affecting what is happening on the roadway?

  17. Practitioner Survey Results • Initial reporting activities • Description of activities being performed • Number of incidents responded to by size of incident • Number of DMS messages posted • Size and operational condition of equipment/staff • Number of devices / staff • Number of operating devices / staff on duty

  18. Practitioner Survey Results • Second level TMC reporting activities generally include • Detailed quantification of activities being performed • Review resource deployment, trends, staff review • Detailed incident descriptive statistics for trend reporting • Log of incident types, timestamps, actions taken • Number of incidents lasting more than 90 minutes • Time required to post DMS messages • Captured video of event activities • Size and operational condition of equipment/staff • Time required to repair equipment

  19. Practitioner Survey Results • Second Level Reporting: Mobility Reporting • Volumes, delays, reliability • Reported as travel times, congestion by location, revenue (HOT lanes), severe weather response • Most TMCs are now reporting current conditions • Not all TMCs are actively summarizing / reporting roadway performance • Increasing use of private-sector probe data • Still need agency-collected volume data
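
  The mobility measures named on slide 19 (delay, reliability) ultimately reduce to arithmetic on archived travel time records. The sketch below is illustrative only and not from the guidebook; the sample values, the free-flow reference time, and the choice of travel time index (TTI) and planning time index (PTI) as the congestion and reliability measures are assumptions.

  import statistics

  def mobility_summary(travel_times_min, free_flow_min):
      # travel_times_min: observed corridor travel times in minutes (e.g., from probe data)
      times = sorted(travel_times_min)
      mean_tt = statistics.mean(times)
      p95_tt = times[int(0.95 * (len(times) - 1))]        # approximate 95th percentile
      return {
          "avg_delay_min": round(max(mean_tt - free_flow_min, 0.0), 1),
          "travel_time_index": round(mean_tt / free_flow_min, 2),     # congestion
          "planning_time_index": round(p95_tt / free_flow_min, 2),    # reliability
      }

  # Example: a corridor with a 5-minute free-flow travel time, observed over peak periods
  print(mobility_summary([6.0, 7.5, 5.5, 9.0, 12.0, 6.5, 8.0], free_flow_min=5.0))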

  20. Practitioner Survey Results • Third Level Reporting: advanced TMCs are actively working on improved analysis and reporting of • Cause, effect, timing of disruptions • Responses to those disruptions • The effect / effectiveness of those responses

  21. Third Level Activities • “After action” reports across agencies and summaries for improving agency response and interaction • Particularly use of video of incident scenes • Before and after reports on road conditions after new activities are undertaken • Benefits analyses for programs (cross cutting analyses)

  22. Guidebook Overview • Two volumes: • Guidebook • Reference Manual • Different from what was originally planned • More effectively meets the needs of different audiences

  23. Final Report Overview • Two volumes • Set up for users to read chapters as needed • Find what you want quickly • Helps reader consider: • Regional / local concerns • Rural / urban / suburban differences • Geographic coverage area • Current vs. future data availability • Budget justifications • Basic vs. advanced measures discussed

  24. The Guidebook Volume • Guidebook: • Executives and upper management are the primary audience • An overview of measures and issues • The “what” and “why” of performance monitoring • Helps an agency get started in performance management

  25. Reference Manual Volume • The details of “how” to do performance management • A synopsis of each performance measure (or group of related performance measures), • An overview of each measure’s • Usefulness, • Required data sources, • Primary calculation steps or equations, • Examples of useful variations of the measure, • Issues or implementation considerations • Example applications from TMCs around the country

  26. Both Volumes • Contain introductory material • Acknowledge that each TMC is different • Provide advice that says to report • On activities performed • Over the geographic area the TMC covers • Start with “basic” measures, and grow to become more advanced to meet changing TMC needs

  27. Both Volumes • Technical material split into four sections, reflecting potential TMC roles and reporting needs • TMC operations • Incident response • System mobility • Cross-cutting • One chapter per set of measures

  28. Other Material • Guidebook • Lessons learned • General guidance and considerations • Reference Manual • Case studies • Examples • References to more details / examples

  29. Reference Manual Checklist for every type of measure

  30. Recommended Approach • Basic measures • Simple • Common to all TMCs working in a topic area • Next compute additional measures from basic data • Supplementary information to aid management • Finally, produce advanced measures • Meet specialized needs

  31. TMCs Have Different Roles & Responsibilities • Performance measures should reflect the activities performed by that TMC • TMC Operations • Incident Response • Mobility • Cross-cutting

  32. TMC Operations Chapter • Purpose & need for measures • Categories of measures • ITS infrastructure • TMC operational responsibilities • Staff performance • Specialized operations • Future trends • Data collection and management

  33. Purpose • Measure the TMC’s ability to meet goals and objectives related to performing its key functions of • Monitoring, • Operating, and • Maintaining traffic management and traveler information systems

  34. ITS Infrastructure / Traveler Info Services • Number of Devices • Coverage • Miles covered • AADT exposed to DMS Source: WisDOT Bureau of Traffic Operations, Statewide Traffic Operations Center (STOC) 2011 Annual Performance Measures Report
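
  The coverage measures on slide 34 (miles covered, AADT exposed to DMS) can be tallied from a roadway segment inventory. The following is a minimal sketch with an invented three-segment inventory, shown only to illustrate the arithmetic; the WisDOT report's actual method may differ.

  # Hypothetical inventory: (segment_miles, segment_AADT, upstream_DMS_present)
  segments = [(4.2, 85000, True), (6.0, 62000, False), (3.1, 110000, True)]

  dms_miles = sum(miles for miles, _, has_dms in segments if has_dms)
  exposed_aadt = sum(aadt for _, aadt, has_dms in segments if has_dms)
  total_aadt = sum(aadt for _, aadt, _ in segments)

  print(f"Miles covered by DMS: {dms_miles:.1f}")
  print(f"AADT exposed to DMS: {exposed_aadt:,} ({exposed_aadt / total_aadt:.0%} of corridor AADT)")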

  35. ITS Infrastructure / Traveler Info Services • Use (agency) • Number of times equipment was used by TMC staff Source: RIDOT Traffic Management Center Incident Statistics (4/1/12 to 6/30/12)

  36. ITS Infrastructure / Traveler Info Services • Use (public) • Number of times traveler information services were used/viewed by the public

  37. ITS Infrastructure: Operational Status • Device Availability/Maintenance Activities • Average Device Availability • Number of repairs/trouble tickets by device type Source: VDOT Hampton Roads Transportation Operations Center 2011 Annual Report
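
  Average device availability is typically the share of time each device was operational, averaged over the fleet and reported by device type. Below is a minimal sketch assuming the uptime hours come from ITS maintenance or network monitoring logs; the figures are invented, and the VDOT report's exact formula may differ.

  # (device_id, device_type, hours_operational, hours_in_reporting_period)
  device_status = [
      ("CCTV-01", "camera", 700, 720),
      ("CCTV-02", "camera", 648, 720),
      ("DMS-10",  "dms",    716, 720),
  ]

  by_type = {}
  for _, dtype, up_hours, period_hours in device_status:
      by_type.setdefault(dtype, []).append(up_hours / period_hours)

  for dtype, ratios in sorted(by_type.items()):
      print(f"{dtype}: average availability {sum(ratios) / len(ratios):.1%}")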

  38. TMC Operational Responsibilities • Number of calls received • Number of incidents managed/responded to • Detection source/how notified Source: Houston TranStar 2010 Annual Report

  39. TMC Staff Performance • FDOT District Six identified and set targets for the key operational performance measures • Can also track staff retention / turnover Source: FDOT District Six ITS Annual Report (Fiscal Year 2010-2011)

  40. Reference Manual Checklist for every type of measure

  41. Data Sources • Need to obtain data from existing automated systems • May require developing new output formats • Data sources • Operator event logs • DMS operating status and message log • ITS maintenance logs • Asset management systems • 511/website system usage reports
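
  Once these systems can export their logs, many second-level measures fall out of simple joins between them. As one hedged illustration (the file names, column names, and timestamp format below are assumptions, not an existing export format), the "time required to post DMS messages" measure mentioned on slide 18 could be derived by joining the operator event log to the DMS message log on an incident ID:

  import csv
  from datetime import datetime

  FMT = "%Y-%m-%d %H:%M:%S"

  def minutes_between(start, end):
      return (datetime.strptime(end, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60

  # Assumed exports: operator_event_log.csv has incident_id, detected_at;
  # dms_message_log.csv has incident_id, posted_at.
  with open("operator_event_log.csv", newline="") as f:
      detected = {row["incident_id"]: row["detected_at"] for row in csv.DictReader(f)}

  with open("dms_message_log.csv", newline="") as f:
      post_lags = [minutes_between(detected[row["incident_id"]], row["posted_at"])
                   for row in csv.DictReader(f) if row["incident_id"] in detected]

  if post_lags:
      median_lag = sorted(post_lags)[len(post_lags) // 2]
      print(f"Median time to post a DMS message: {median_lag:.1f} minutes")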

  42. Future Trends • A growing emphasis on delivery of individualized travel information • Private-sector smartphone apps • Waze • Inrix • BUMP.com • Need to make real-time data feeds available • Track who is using those feeds • Downtime of those feeds

  43. Incident Management Chapter • Purpose and need for measures • Categories of measures • Traffic incident statistics • Incident time • Full-function safety service patrols • Future trends • Data collection and management

  44. Purpose of IR Reporting • Describes the need for incident response activities • Number of disruptions occurring • Describes incident attributes • Allows informed decision making • Resources needed • Appropriate deployment of those resources

  45. Incident Management: Basic Incident Statistics • Number of incidents, by: • Type or severity • Location • Time of day, day of week, time of year • Duration of incident

  46. Incident Management: Basic Incident Statistics • Track incident characteristics, e.g., • Tractor trailers/semi involvement • Lane blockages • Rollovers • Secondary crashes

  47. Incident Management: More Advanced Incident Statistics • Incident counts by duration • Major: 2 hrs+ (or 90 minutes+) • Moderate: 30 min – 2 hrs • Minor: < 30 min. Source: Manual on Uniform Traffic Control Devices
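
  Binning incidents into these duration categories is a small classification step once start and clearance timestamps are archived. Below is a sketch using the thresholds from the slide; the timestamp format is an assumption, and the 120-minute major threshold can be swapped for 90 minutes per agency policy.

  from datetime import datetime

  FMT = "%Y-%m-%d %H:%M"

  def duration_category(start, cleared, major_threshold_min=120):
      minutes = (datetime.strptime(cleared, FMT) - datetime.strptime(start, FMT)).total_seconds() / 60
      if minutes >= major_threshold_min:
          return "major"       # 2 hours or more (or 90+ minutes, per agency policy)
      if minutes >= 30:
          return "moderate"    # 30 minutes to 2 hours
      return "minor"           # under 30 minutes

  print(duration_category("2012-04-01 07:05", "2012-04-01 07:25"))   # minor
  print(duration_category("2012-04-01 16:10", "2012-04-01 19:00"))   # major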

  48. Incident Management: More Advanced Incident Statistics • Secondary Crashes • a) within an incident scene or • b) within the queue, including the opposite direction, resulting from the original incident Source: RIDOT Traffic Management Center Incident Statistics 2011 Annual Report

  49. Incident Management: Advanced Basic Measures: Incident Time • Incident Timeline (figure)

  50. Incident Time (Contd.)
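
  The incident time measures behind slides 49 and 50 are differences between milestone timestamps on the incident timeline (detection, verification, responder arrival, roadway clearance, incident clearance). The sketch below shows that arithmetic; the milestone names and the choice to measure clearance times from detection are assumptions, since agencies define these intervals differently.

  from datetime import datetime

  FMT = "%Y-%m-%d %H:%M"

  def incident_times(milestones):
      # milestones: dict mapping milestone name to timestamp string
      t = {name: datetime.strptime(stamp, FMT) for name, stamp in milestones.items()}

      def minutes(a, b):
          return (t[b] - t[a]).total_seconds() / 60

      return {
          "verification_time_min": minutes("detected", "verified"),
          "response_time_min": minutes("verified", "responder_arrived"),
          "roadway_clearance_time_min": minutes("detected", "lanes_cleared"),
          "incident_clearance_time_min": minutes("detected", "scene_cleared"),
      }

  example = {
      "detected": "2012-04-01 08:00",
      "verified": "2012-04-01 08:04",
      "responder_arrived": "2012-04-01 08:18",
      "lanes_cleared": "2012-04-01 08:47",
      "scene_cleared": "2012-04-01 09:10",
  }
  print(incident_times(example))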
