
EVM Central Repository: Reporting Compliance & Data Quality Findings

EVM Central Repository: Reporting Compliance & Data Quality Findings. January 29, 2009. Russ Vogel Acquisition Resources and Analysis, Enterprise Information & OSD Studies Office of the Under Secretary of Defense for Acquisition, Technology and Logistics Russell.vogel@osd.mil


Presentation Transcript


  1. EVM Central Repository: Reporting Compliance & Data Quality Findings
  January 29, 2009
  Russ Vogel
  Acquisition Resources and Analysis, Enterprise Information & OSD Studies
  Office of the Under Secretary of Defense for Acquisition, Technology and Logistics
  Russell.vogel@osd.mil, (703) 845-6677

  2. Overview: How We Use The Data Providing Timely and Accurate Key Acquisition Data

  3. EVM-CR Metrics (as of 1/5/2009)
  • Number of programs reporting: MDAP: 69; MAIS: 11
  • Number of contracts/tasks reporting: MDAP: 166; MAIS: 17
  • Number of submissions: 3,586 total; >250 new submissions per month
  • Number of users: 1,303 total; 394 active
  • Downloads per month: >2,000

  4. EVM-CR Reporting Compliance
  • In September 2008, began first-level data quality review of submissions to the EVM-CR
  • Goals:
    • Compliance with CDRLs: timeliness and completeness (e.g., are all the required formats being submitted?)
    • Compliance with policy: data submission form factor (e.g., EDI)
    • Alignment with other OSD (and Service) data systems (e.g., DAMIR, AV SOA)
  • Coordinating activities with Service-level acquisition staff to take corrective actions as required
  • Findings: most programs are making an effort to meet the requirement
    • As of January 2009, many of the identified issues are being worked, and we are seeing good progress
    • As expected: normal, day-to-day "mistakes" (e.g., forgetting to hit the SUBMIT button)
    • Beginning to tackle the more difficult issues

  5. EVM-CR Reporting Compliance: Process & Data Quality Issues
  • Easy fixes (requiring minor software changes or better guidance/training):
    • Mislabeling the submission file (e.g., identifying an XML file as an X12 TS839)
    • Data errors (contract numbers, contractor name, "as-of date" not as expected)
    • Missing components of the submission (e.g., no Format 5)
      • For example, submitting a partial data package on the due date and then submitting the missing file(s) on a subsequent day as a "re-submit"
      • Requirement: submit a complete data package as a single event
    • Submissions against the wrong submission events
  • Data validation rules / sanity checks
    • Examples:
      • BCWS must be less than or equal to BAC
      • TAB must equal CBB unless an OTB is specified
    • Potential causes:
      • Incomplete submission (missing data in the EDI/XML file)
      • Some issues identified (and corrected) in the EVM-CR data processing routines that read EDI/XML files
      • Validation rules may need to be updated (theory vs. reality)
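The two sanity checks named above can be sketched as code. This is a minimal illustration, not the EVM-CR's actual validation routine; the record layout and function name are hypothetical, and a real CPR submission carries many more fields.

```python
# Hypothetical sketch of the two validation rules from the slide.
# Record fields (bcws, bac, tab, cbb, otb) are illustrative names only.

def validate_cpr(rec):
    """Return a list of rule violations for one CPR record."""
    errors = []
    # Rule: cumulative BCWS must be less than or equal to BAC.
    if rec["bcws"] > rec["bac"]:
        errors.append("BCWS (%.1f) exceeds BAC (%.1f)" % (rec["bcws"], rec["bac"]))
    # Rule: TAB must equal CBB unless an Over-Target Baseline is specified.
    if rec["tab"] != rec["cbb"] and not rec.get("otb"):
        errors.append("TAB != CBB with no OTB specified")
    return errors

record = {"bcws": 120.0, "bac": 100.0, "tab": 105.0, "cbb": 100.0, "otb": False}
print(validate_cpr(record))
```

A clean record would return an empty list; checks like these run best immediately on receipt, so the submitter can be told about violations within the same submission event.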

  6. EVM-CR Reporting Compliance: Process & Data Quality Issues
  • Submitted EDI or XML files not processing
    • Again, some issues identified and corrected in the EVM-CR data processing routines that read EDI/XML files
  • Non-compliant XML
    • Current policy/requirement is EDI X12 compliant with the TS-839 format (e.g., WINSIGHT TRN format)
    • EVM-CR also supports XML consistent with the WINSIGHT XML schema
    • Not all XML files are acceptable (e.g., Excel saved as XML)
  • What should be included in XML
    • WINSIGHT too flexible
    • Need: one reporting period at a "reasonable" reporting level
    • Example: in one case a single XML file is 300+ megabytes, requiring "zipping"
  • May require re-examining CDRL and DID language to clarify or tighten the requirement
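A first-pass file sniffer can catch the mislabeling problems above before full parsing: an X12 interchange begins with an "ISA" envelope segment, real XML begins with "<", and an Excel "save as XML" export is XML but carries Microsoft's SpreadsheetML namespace rather than the expected schema. The function below is a hypothetical sketch of that triage, not the repository's actual intake logic.

```python
# Hypothetical intake triage for the mislabeling issues on the slide.

def sniff_submission(first_bytes: bytes) -> str:
    text = first_bytes.lstrip()
    if text.startswith(b"ISA"):
        return "edi-x12"            # candidate TS839 interchange
    if text.startswith(b"<?xml") or text.startswith(b"<"):
        if b"urn:schemas-microsoft-com:office:spreadsheet" in first_bytes:
            return "excel-xml"      # Excel saved as XML: reject
        return "xml"                # validate against the expected schema next
    return "unknown"

print(sniff_submission(b"ISA*00*..."))   # an X12 file labeled as XML is caught here
```

A check this cheap can run at upload time, so the submitter is told about a mislabeled file immediately instead of during a later processing run.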

  7. Significant Alignment Issues
  • EVM-CR "contract tasks" not aligned with "effort numbers"
    • "Effort number" is an internal OSD database key used for organizing reporting activities below the contract level
    • EVM-CR submissions occur at the contract level or at lower "contract task" levels
      • Used when multiple CPRs are required on a contract each month (e.g., a "total CPR" plus multiple component CPRs)
      • Reporting by CLINs / delivery orders (e.g., T-AKE, DDG, AB3, MPS, …)
      • Teaming relationships (e.g., V-22)
    • Problems:
      • Component CPRs may or may not sum properly to the total
      • Multiple component CPRs provided without a total (or the total not in EDI)
      • Typically ad hoc methods (comments) used for identifying the total CPR
  • Internal OSD/Government alignment issues
    • Contract reporting requirements need to be aligned with the presentation and data needs of leadership (and IT systems)
    • Effort numbers without matching tasks; tasks without matching effort numbers
    • Aligning CDRLs to EVM policy (e.g., reducing tailoring such as elimination of the EDI requirement)
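The "components may or may not sum to the total" problem above lends itself to a simple roll-up check: sum each EV quantity across the component CPRs and compare against the total CPR within a rounding tolerance. The field names and tolerance below are illustrative assumptions, not the EVM-CR's actual schema or rules.

```python
# Minimal roll-up check (hypothetical field names): component CPRs should
# sum to the "total CPR" within a tolerance; otherwise there is double
# counting or a gap in the reporting structure.

def check_rollup(total, components, tol=0.5):
    for field in ("bcws", "bcwp", "acwp"):
        comp_sum = sum(c[field] for c in components)
        if abs(comp_sum - total[field]) > tol:
            yield "%s: components sum to %.1f, total CPR reports %.1f" % (
                field, comp_sum, total[field])

total = {"bcws": 300.0, "bcwp": 280.0, "acwp": 295.0}
parts = [{"bcws": 100.0, "bcwp": 90.0, "acwp": 100.0},
         {"bcws": 200.0, "bcwp": 180.0, "acwp": 180.0}]
print(list(check_rollup(total, parts)))
```

A check like this only works once the total CPR is machine-readable and reliably identified, which is exactly why the ad hoc comment-based labeling of the total is a problem.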

  8. Summary
  • Why is this important?
    • Automation/integration with other data systems (AV SOA, DAMIR, MilDep, etc.) provides timely access and visibility to contract reporting levels for stakeholders and decision makers
    • Data is being used by senior leadership (e.g., trip-wire analysis)
    • Alignment is necessary: reporting must roll up to the contract level properly, with no room for double counting or gaps
  • Current activities:
    • Continue working with the Services to coordinate corrective actions on data alignment and quality issues
    • Updating the EVM-CR and on-line documentation and training materials to clarify requirements
    • Working the bigger issues case by case
    • Looking at CDRLs and alignment to policy
    • Aligning data submissions to other OSD IT systems

  9. Backups

  10. Reporting Status

  11. Reporting Status

  12. Reporting Status

  13. AV SOA Governance and Technical Approach
  [Architecture diagram: business tools, business applications, and web user interfaces serving acquisition users and Defense acquisition decision making reach authoritative data (Army, Air Force, Navy, DoD, Federal, other) through discoverable and accessible enterprise services, all under data governance. SOA separates data from applications and tools.]
  • Governance of data:
    • Definition of key data elements
    • Assignment of responsibility for the authoritative copy of the specified data elements
    • Provision of access to governed data

  14. AT&L AV SOA Pilot Data Services
  • Data brought under governance for the pilot includes 140 elements in the following major categories, which correspond to the AT&L AV SOA services:
    • EVM – EVM elements used in the demo, plus contract elements included in DAMIR's "Contract Data Point" and/or reported on the Contract Performance Report (CPR)
    • Nunn-McCurdy Unit Cost – current estimate vs. APB (current and original) at the total-appropriation level (RDT&E and Procurement), by fiscal year, for comparison
    • Budget – current President's Budget and POM/BES submission, by appropriation and fiscal year, to provide a reference point for analysis
    • Milestone – program milestones as agreed upon in the APB
    • Science & Technology – to compare Key Performance Parameters, thresholds, and objectives to current measurements and to identify critical technologies
    • Program Administration – to organize/view information by program, sub-program, budget activity, program element, budget line item, and/or project code

  15. AT&L AV SOA Pilot – As of 1/8/2009
  [Status table: each row lists a data source's repository and location, data system manager and location, application/tools used, number of programs, authoritative vs. unavailable/static elements, and published data displays.]
  • Army WS (Army AIM): SOA Enterprise Service Bus (ESB), Radford Army Ammunition Plant, Radford, Virginia; managed by PEO EIS, Ft. Belvoir, Virginia; 10 programs; 3 S&T elements static; displays published to DAMIR: Contracts, Cost & Funding, Performance, Schedule, Unit Cost, Track to Budget
  • Navy WS (Navy Dashboard): NMCI, Navy Annex, Arlington, Virginia; managed by ASN RD&A (Management & Budget), Arlington, Virginia, with SPAWAR, Charleston, South Carolina; 15 programs; 1 S&T element and 1 Admin element static; displays published to K-Scope: Contract EVM, Nunn-McCurdy, Milestones, Contracts by Location, Performance
  • AF WS (Air Force SMART): 754th ELSG, Gunter AFS, Montgomery, Alabama, and 754th ELSG, Hanscom AFB, Massachusetts; 12 programs; 1 Admin element static
  • DAMIR WS (DAMIR): managed by OSD/ARA, AT&L, Arlington, Virginia; draws on all of the above; 12 elements; displays published to the AV SOA Portal: Current APB, Contract Details, Contract EVM, Nunn-McCurdy, Budget, Milestones, Science & Technology
  • CR WS (OSD/PA&E CR): SPAWAR, Charleston, South Carolina; managed by PA&E, Arlington, Virginia; 27 programs; 58 elements; EVM data; data cleanup needed on some contracts

  16. AV SOA Overview

  17. AV SOA Methodology to aggregate Contract and EV data

  18. AV SOA EV Data Findings
  • To date, USA AIM, CR, and DAMIR have been integrated with AV SOA
  • For the pilot:
    • The most current EV data is often from the Army and not from CR
    • The most current CR EV data often exists in PDF format, which is not consumable by AV SOA web services
    • Contract Effort Number is missing from CR data (it is either null or zero)
  • We will pull from the latest EV source (MILDEP or CR)
  • Extra logic and error handling was required to map CR

  19. AV SOA EV Data Findings: As Of Date / Report Period End Date
  • Discriminating field for AV SOA; for reference:
    • "As Of" – Central Repository
    • "Reporting Period End Date" – AV SOA
    • "Report Date" – MILDEP (AIM)
  • Used to determine the most current available CPR/EVM report
  • The Report Date is initially pulled from the MILDEP contract and then used to align data from CR
  • Using the CPR As Of Date, AV-SOA selects the most recent report; within AIM we often find a more current Report End Date
  • As a result, our logic often defaults to incorporating EV data from MILDEP
  • Recommendation: the MILDEP Report Date and CR As Of Date should be identical for a given reporting period
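The "pick the most current report" logic described above can be sketched as a comparison over (date, source) pairs, assuming each source's report reduces to a record with a date field. The field names and tie-break are illustrative assumptions, not the real AV SOA schema; the tie-break toward MILDEP mirrors the observation that MILDEP dates often run ahead of CR.

```python
# Sketch of selecting the most current CPR/EVM report across sources.
# 'source' and 'as_of' are hypothetical field names.
from datetime import date

def latest_report(reports):
    """Return the report with the latest as-of date; MILDEP wins ties."""
    return max(reports, key=lambda r: (r["as_of"], r["source"] == "MILDEP"))

reports = [
    {"source": "CR",     "as_of": date(2008, 11, 30)},
    {"source": "MILDEP", "as_of": date(2008, 12, 31)},
]
print(latest_report(reports)["source"])  # the MILDEP report is one month fresher
```

Note that this kind of comparison only behaves well when both systems stamp the same reporting period with the same date, which is exactly the recommendation on the slide.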

  20. AV SOA EV Data Findings: Non-Consumable Data
  • CR has no error/status code indicating that CPR data is not consumable; the only indicator is a blank file
  • Consumable report availability appears to lag within CR
  • Recommendation: CPR consumable status should be reflected through the CR web services
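Because the blank file is currently the only signal, a consuming system has to infer status itself before parsing. A hypothetical client-side guard, standing in for the status code the slide recommends CR expose directly:

```python
# Hypothetical client-side guard: a blank (empty or whitespace-only)
# payload is the only "not consumable" indicator CR currently gives.

def is_consumable(payload: bytes) -> bool:
    return bool(payload and payload.strip())

print(is_consumable(b""))         # blank file: skip, try an alternate source
print(is_consumable(b"ISA*00*"))  # non-blank: hand to the real parser
```

An explicit status field in the web service response would let clients distinguish "not yet available" from "submitted but not consumable", which a blank-file check cannot do.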

  21. AV SOA EV Data Findings: Mismatch in Contracts from MILDEP and CR
  • AV-SOA obtains an MDAP's contract list from the MILDEP system; that contract listing is used to query CR
  • For example, MIM-104D (Patriot MEADS CAP) – PNO 531:
    • DAAH01-02-C-0075 (not being reported by MILDEP)
    • DAAH01-03-C-0164 (reported by both)
    • NAMEAD-04-C-6000 (reported by both)
  • MILDEP only shows two contracts
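Mismatches like this surface directly from a set difference between the two systems' contract lists. The contract numbers below are the ones quoted on the slide; the list-retrieval step itself is assumed away.

```python
# Cross-check the contract lists for PNO 531 with a set difference.
mildep = {"DAAH01-03-C-0164", "NAMEAD-04-C-6000"}
cr     = {"DAAH01-02-C-0075", "DAAH01-03-C-0164", "NAMEAD-04-C-6000"}

missing_from_mildep = cr - mildep
missing_from_cr     = mildep - cr

print(sorted(missing_from_mildep))  # ['DAAH01-02-C-0075']
print(sorted(missing_from_cr))      # []
```

Because AV-SOA queries CR using the MILDEP list, any contract in `missing_from_mildep` is invisible to the pilot even though CR holds data for it; reconciling in both directions catches that case.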

  22. AV SOA EV Data Findings: Contract Effort Number
  • AV SOA definition: the effort number within the contract, if multiple efforts within the contract are being tracked separately. DI-MGMT-81466A: enter the contract number and the applicable Contract Line Item Numbers (CLINs)
  • An AV SOA "required" field
  • For contracts without multiple efforts, no separate Contract Effort Number is delineated; AV-SOA currently publishes a "-1"
  • Recommendations?
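One way a consumer can cope with the "-1" sentinel (and the null/zero values seen in CR data) is to normalize them all to a single "no separate effort" value. This is a hypothetical normalization sketch, not AV SOA's actual handling, and the field values treated as sentinels are assumptions drawn from the findings above.

```python
# Hypothetical consumer-side normalization of the Contract Effort Number:
# treat null, empty, zero, and the "-1" sentinel as "single-effort contract"
# rather than as a real effort number.

def normalize_effort_number(raw):
    if raw in (None, "", 0, "0", -1, "-1"):
        return None   # no separate effort tracked
    return str(raw)

print(normalize_effort_number("-1"))      # sentinel published by AV-SOA
print(normalize_effort_number("0002AA"))  # a genuine effort identifier
```

Keeping the sentinel mapping in one function means that if the recommendation changes (e.g., CR starts populating the field, or a different sentinel is adopted), only this one spot needs updating.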

  23. For Official Use Only – Pre-Decisional
