
CVSA and FMCSA


Presentation Transcript


  1. CVSA and FMCSA SSDQ Performance Measures: Crash Timeliness, Inspection Timeliness April 21–22, 2013 CVSA Spring Workshop

  2. Introduction Candy Brown Presenter, Timeliness Performance Measures and Reports Kevin Berry Presenter, Improvement Strategies

  3. Agenda • Overview of Timeliness Performance Measures • Why Timeliness Matters • Training Objectives and Expected Outcomes • How State Ratings Are Determined • How to Interpret Data Quality Reports • When and How to Improve Data Quality

  4. Overview of Timeliness Performance Measures • Crash Timeliness: the percentage of fatal and non-fatal crash records submitted to the Motor Carrier Management Information System (MCMIS) within 90 days of the crash event over a 12-month period • Inspection Timeliness: the percentage of inspection records submitted to MCMIS within 21 days of the inspection event over a 12-month period
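
As a rough illustration of these definitions, the sketch below checks a single record's timeliness by comparing its event date to its MCMIS upload date against the 90-day (crash) or 21-day (inspection) window; the function and field names are hypothetical, not part of any FMCSA system.

```python
from datetime import date

# Reporting windows from the SSDQ Timeliness definitions (in days).
CRASH_WINDOW_DAYS = 90       # crash records: within 90 days of the crash event
INSPECTION_WINDOW_DAYS = 21  # inspection records: within 21 days of the inspection event

def is_timely(event_date: date, upload_date: date, record_type: str) -> bool:
    """Return True if the record reached MCMIS within its reporting window.

    record_type is either "crash" or "inspection"; names are illustrative only.
    """
    window = CRASH_WINDOW_DAYS if record_type == "crash" else INSPECTION_WINDOW_DAYS
    return (upload_date - event_date).days <= window

# Example: a crash on June 1 uploaded September 5 (96 days later) counts as late.
print(is_timely(date(2012, 6, 1), date(2012, 9, 5), "crash"))  # False
```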

  5. State Safety Data Quality (SSDQ) Measures [Diagram: the Overall State Rating is built from Inspection measures (Timeliness, Accuracy, Record Completeness, Driver Identification Evaluation, Vehicle Identification Evaluation, VIN Accuracy) and Crash measures (Timeliness, Accuracy, Record Completeness – Fatal and Non-Fatal, Driver Identification Evaluation, Vehicle Identification Evaluation, Crash Consistency (Overriding Indicator))]

  6. Why Timeliness Matters The Safety Measurement System (SMS) Behavior Analysis and Safety Improvement Categories (BASICs) need timely data to target the right carriers for safety performance interventions. • The SMS weighs recent events more heavily; missing events mean that carriers’ SMS BASICs could be better or worse than they should be • Carriers may not be targeted for investigations • Inconsistent timeliness among States could skew the SMS results in favor of carriers operating in States that report late. SMS weights crashes by time of event: most recent 6 months = Weight 3; 7–12 months = Weight 2; 12–24 months = Weight 1
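
To make the time weighting concrete, here is a minimal sketch that assigns the slide's crash weights by event age. The function name and month-based cut-offs are illustrative only, not the actual SMS implementation.

```python
def sms_time_weight(months_since_event: float) -> int:
    """Assign the SMS crash time weight shown on the slide.

    Events older than 24 months fall outside the window (weight 0 here).
    """
    if months_since_event <= 6:
        return 3   # most recent 6 months
    if months_since_event <= 12:
        return 2   # 7-12 months
    if months_since_event <= 24:
        return 1   # 12-24 months
    return 0       # beyond the 24-month window

print(sms_time_weight(4))   # 3
print(sms_time_weight(18))  # 1
```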

  7. Training Objectives • Explain the Crash and Inspection Timeliness performance measures • Explore Timeliness reports • Show how data collection and processing errors can affect Timeliness ratings • Identify FMCSA resources for improving data quality

  8. Expected Outcomes • Understand Timeliness performance measure methodology • Interpret Timeliness rating results • Interpret Timeliness State Data Analysis Reports (SDAR) and custom reports/SAFETYNET Queries • Identify potential sources of collection and reporting issues • Identify FMCSA resources for improving data quality

  9. How State Ratings Are Determined Training

  10. Methodology Crash Timeliness • Determines a crash rating (Good, Fair, Poor) based on the percent of records reported to MCMIS within 90 days of the crash event • 12-month time span • Evaluates fatal and non-fatal crash records Inspection Timeliness • Determines an inspection rating (Good, Fair, Poor) based on the percent of inspection records reported to MCMIS within 21 days of the inspection event • 12-month time span • Evaluates inspection records

  11. Evaluation Period = Event Date Range 12 Months of MCMIS Data • Based on event date, not upload date • “Rolling” 12-month period • Excludes the most recent 3 months
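
As an illustration of the description above, here is a minimal sketch that derives such a rolling 12-month event-date range from a data snapshot date, assuming the window is the 12 full calendar months ending three months before the snapshot month; the function name is hypothetical and this is not FMCSA's published algorithm.

```python
from datetime import date

def evaluation_window(snapshot: date) -> tuple[date, date]:
    """Return (start, end) of a 12-month event-date range that
    excludes the 3 most recent months before the snapshot."""
    # Last month included = 3 full months before the snapshot month.
    end_year, end_month = snapshot.year, snapshot.month - 3
    if end_month <= 0:
        end_year, end_month = end_year - 1, end_month + 12
    # Window starts 11 months before the last included month.
    start_year, start_month = end_year, end_month - 11
    if start_month <= 0:
        start_year, start_month = start_year - 1, start_month + 12
    # End on the last day of the final month (day 1 of the next month, minus one day).
    next_month = date(end_year + (end_month == 12), end_month % 12 + 1, 1)
    return date(start_year, start_month, 1), date.fromordinal(next_month.toordinal() - 1)

# A snapshot taken March 22, 2013 would evaluate events from 1/1/2012 through 12/31/2012.
print(evaluation_window(date(2013, 3, 22)))  # (2012-01-01, 2012-12-31)
```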

  12. Ratings • Timeliness ratings calculated every month • Results posted on the A&I Data Quality Website • Percent of Timely Records = Number of Records Reported On Time ÷ Number of Total Records Evaluated
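
As a sketch of the calculation: compute the percentage from the formula above, then map it to a rating band. The Good/Fair/Poor cut-off values below are hypothetical placeholders; the actual SSDQ thresholds are not given on this slide.

```python
def percent_timely(on_time: int, total_evaluated: int) -> float:
    """Percent of Timely Records = Records Reported On Time / Total Records Evaluated."""
    return 100.0 * on_time / total_evaluated if total_evaluated else 0.0

def rating(pct: float, good_cutoff: float = 90.0, fair_cutoff: float = 65.0) -> str:
    """Map a timeliness percentage to Good/Fair/Poor.

    good_cutoff and fair_cutoff are placeholder values for illustration,
    not the published SSDQ rating criteria.
    """
    if pct >= good_cutoff:
        return "Good"
    if pct >= fair_cutoff:
        return "Fair"
    return "Poor"

pct = percent_timely(on_time=940, total_evaluated=1000)  # illustrative counts
print(f"{pct:.1f}% -> {rating(pct)}")  # 94.0% -> Good
```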

  13. How to Interpret Data Quality Reports Training

  14. How to Use Data Quality Reports Three types of reports: 1) Rating Results 2) State Data Analysis Reports 3) Custom Reports & SAFETYNET Queries What you can do with them: • Spot trends in reporting • Identify how late records are when reported • Monitor upload frequency to MCMIS • Identify late records by jurisdiction

  15. Monthly Rating Results • Event Date Range: 1/1/2012 – 12/31/2012 (12 months of MCMIS data) • MCMIS snapshot taken March 22, 2013 • (Example shown is not actual data)

  16. Crash Timeliness Ratings How to Interpret • Report displays the last 13 ratings in a bar chart and a table • Each rating based on the percentage of timely records in MCMIS • Compares current and previous results to identify trends When to Act • Unusual or significant change in percent or number of timely records • Slow decline in rating • Even when the rating is Good

  17. State Data Analysis Reports (SDAR): Crash Record Timeliness − Monthly Analysis How to Interpret • Details of evaluation period • Timeliness of records by month of event • Trends in timeliness and counts When to Act • Downward trend in timeliness • Change in counts between months

  18. SDAR (cont.) Crash Record Timeliness − Number of Days Between Record Uploads to MCMIS How to Interpret • Three days or more between uploads to MCMIS • Trends in timeliness and volume When to Act • Frequent instances of uploads more than three days apart • Significant change in volume
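
One way to reproduce this days-between-uploads check from a list of upload dates is sketched below; the three-day threshold follows the slide, while the dates and variable names are made up for the example.

```python
from datetime import date

# Hypothetical MCMIS upload dates for one State (illustrative data only).
upload_dates = sorted([
    date(2012, 6, 1), date(2012, 6, 2), date(2012, 6, 8), date(2012, 6, 9),
])

# Flag gaps of three days or more between consecutive uploads.
for previous, current in zip(upload_dates, upload_dates[1:]):
    gap = (current - previous).days
    if gap >= 3:
        print(f"{previous} -> {current}: {gap} days between uploads")
# Output: 2012-06-02 -> 2012-06-08: 6 days between uploads
```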

  19. SDAR (cont.) Inspection Timeliness − Records Reported by Inspector How to Interpret • Sort by Inspector ID, number or percentage of on-time and late records, or total evaluated records When to Act • Inspectors with a high number or percentage of late records • Widespread distribution of late records (inspector numbers hidden from view in the example)
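
A minimal sketch of the by-inspector tally: group records on a hypothetical inspector ID and count late versus total submissions. The field names and data are invented for illustration.

```python
from collections import Counter

# (inspector_id, was_late) pairs -- illustrative records, not real data.
records = [("A101", False), ("A101", True), ("B202", True), ("B202", True), ("B202", False)]

late = Counter(insp for insp, was_late in records if was_late)
total = Counter(insp for insp, _ in records)

# Rank inspectors by share of late records, highest first.
for insp in sorted(total, key=lambda i: late[i] / total[i], reverse=True):
    pct_late = 100.0 * late[insp] / total[insp]
    print(f"Inspector {insp}: {late[insp]}/{total[insp]} late ({pct_late:.0f}%)")
# Inspector B202: 2/3 late (67%)
# Inspector A101: 1/2 late (50%)
```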

  20. Custom Reports and SAFETYNET Queries • Explore specific data quality issues: • Crash event date by upload date • The timeliness of records in each upload to MCMIS • When there are late records or a change in counts • Comparison of timeliness by agency • Create a SAFETYNET query, including: • List of records input late to SAFETYNET
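
The "records input late to SAFETYNET" list could be approximated with a filter like the one below. It assumes each record carries an event date and a SAFETYNET input date and that the 90-day crash window is used as the cut-off; that assumption, the names, and the data are all illustrative.

```python
from datetime import date

# Hypothetical crash records: (report_number, event_date, safetynet_input_date).
crashes = [
    ("C-001", date(2012, 3, 10), date(2012, 4, 2)),
    ("C-002", date(2012, 3, 15), date(2012, 7, 20)),
]

# List records whose SAFETYNET input fell outside the 90-day crash window.
late_records = [
    (num, (entered - event).days)
    for num, event, entered in crashes
    if (entered - event).days > 90
]
print(late_records)  # [('C-002', 127)]
```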

  21. Custom Reports: Crash Event Date by Upload Date

  22. Custom Reports: Comparison of Timeliness by Agency

  23. SAFETYNET Queries: Days from Event Date to SAFETYNET

  24. When and How to Improve Data Quality Training

  25. Data Collection and Reporting Process Key to improving Crash and Inspection Timeliness: understand your State’s collection and reporting process. All crashes meeting FMCSA criteria must be reported on time. [Process flow: Collect (Law Enforcement) → Select (State Organization) → Report (MCSAP Office)]

  26. Crash Data Collection and Reporting Process

  27. Collect and Transfer Crash Reports Promptly [Process steps: Collect Data at Scene → Review/Correct → Transfer] Possible Actions by Law Enforcement • Ensure officers understand the importance of timeliness to FMCSA • Formal training • Feedback to individual officers and/or agencies • Ensure reports are transferred promptly • Prioritize FMCSA crash reports • Pay attention to system updates or changes in electronic collection

  28. Process and Transfer Crash Reports Promptly [Process steps: Receive → Review/Input → ID FMCSA Reportables → Forward Report → Transfer] Possible Actions at State Crash Repository • Assess applicable procedures to ensure records are reviewed and promptly transferred to the MCSAP Office • Prioritize FMCSA crash reports for processing • Prioritize FMCSA crash reports for transfer to the MCSAP Office • Track crash reports sent back to the officer for correction • Pay attention to system updates or changes in electronic transfer to ensure records are not delayed

  29. Process and Upload Crash Reports Promptly [Process steps: Receive Forwarded Report → Review/Input to SAFETYNET → ID FMCSA Reportables → Upload to MCMIS] Possible Actions in the MCSAP Office • Identify and implement improvements for processing and uploading reports • Validate number of reports received from the State crash repository • Track crash reports sent back to the officer for correction • Consider SMS weightings and Timeliness cut-offs when prioritizing backlogs • Address backlogs by adding/reassigning staff • Upload to MCMIS daily • Check activity logs daily for rejected records

  30. What to Do Next Interagency Coordination: How Does It Work in Your State? [Diagram: MCSAP Office coordinating with State Police, State Crash Agency, Local Law Enforcement Agencies, and Other State Agencies]

  31. Contacts Candy Brown SSDQ Measure Development and Analysis Candace.Brown@dot.gov 617-494-3856 Kevin Berry Technical Analyst Kevin.Berry@dot.gov 617-494-2857

  32. Training Recap I am now able to: • Understand Timeliness performance measure methodology • Interpret Timeliness rating results • Interpret Timeliness SDAR • Identify potential sources of collection and reporting issues • Identify FMCSA resources for improving data quality
