
Applying a Life-Cycle Approach to Information Quality


Presentation Transcript


1. Applying a Life-Cycle Approach to Information Quality: ARRA Performance Data. Office of the Chief Financial Officer and the Office of Environmental Information, May 2009

2. What are performance data?
  • Data collected and used to monitor and report a program's accomplishments (its activities, outputs, and outcomes) relative to a target
  • Performance data can be environmental, administrative (counts), activity-based, survey-based, or of other types

3. The American Recovery and Reinvestment Act (ARRA) of 2009

4. OMB's guidance on quality assurance for ARRA performance information:
  • "…agencies are responsible for pre-dissemination review of all information that will appear on Recovery.gov. All agencies must ensure all reporting related to ARRA funding is complete and accurate and complies with the agency's Information Quality Act guidelines. Each agency will provide its point-of-contact for quality information on its Recovery.gov page."

5. Other accountability objectives in ARRA:
  • Public benefits of funds are reported clearly, accurately, and in a timely manner
  • Funds are used for authorized purposes
  • No unnecessary project delays or cost overruns
  • Programs meet their goals and targets and contribute to improvement of broad economic indicators

6. OMB Guidance on Performance Measures under ARRA
  • Performance measures are expected quantifiable outcomes…with each outcome supported by corresponding quantifiable output(s), to indicate incremental change against the present level of performance
  • Programs must specify the results reporting period (e.g., monthly, quarterly), the measurement methodology, and how the results will be made accessible to the public (illustrated in the sketch below)
  • Use of existing measures will allow the public to see the marginal performance impact of the Recovery Act investments and will reduce burden
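As a rough illustration only, a program's measure definition could capture the elements named above in a structured record. The class, field names, and example values below are assumptions made for this sketch, not drawn from the OMB guidance.

# Hypothetical sketch of a performance-measure definition capturing the
# elements named in the guidance: a quantifiable outcome supported by
# quantifiable outputs, a baseline for incremental change, a reporting
# period, a measurement methodology, and how results are made public.
# All names and values are illustrative assumptions.

from dataclasses import dataclass


@dataclass(frozen=True)
class MeasureDefinition:
    outcome: str               # quantifiable outcome the program commits to
    supporting_outputs: tuple  # quantifiable outputs behind the outcome
    baseline: float            # present level of performance
    reporting_period: str      # e.g., "monthly" or "quarterly"
    methodology: str           # how the result is measured
    public_access: str         # how results are made accessible to the public


example = MeasureDefinition(
    outcome="Tons of diesel emissions reduced",
    supporting_outputs=("Engines retrofitted", "Engines replaced or retired"),
    baseline=0.0,
    reporting_period="quarterly",
    methodology="Engine counts reported by grantees, converted with standard emission factors",
    public_access="Posted to the agency's Recovery.gov page",
)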

7. ARRA Performance Measures (Available in Draft)
  • Developed by programs receiving ARRA funds
  • Mix of outcomes and outputs [1]
  • Many are existing planning and annual commitment measures
  • "Green" measures and economic efficiency measures (e.g., jobs created/preserved)
  • Measures developed by the OIG to detect waste, abuse, and fraud
  [1] Examples: number of existing heavy-duty diesel engines (including school bus engines) that have been retrofitted, replaced, or retired (DERA); number of states that have awarded all of the 20% green project reserve (CW SRF); number of jobs leveraged (Brownfields); number of criminal, civil, and administrative actions (OIG)

8. Performance Measure/Data Quality Life Cycle (Plan, Do, Check, Act)
  • Plan
    • Performance measures development: program performance questions; data/information sources and definitions; acceptance criteria/data quality objectives (award conditions)
    • Data/information QA plan: sampling and lab analysis plans/protocols; reporting and record keeping; database design and operation; data input protocol; statistical analysis and presentation; roles and responsibilities
  • Do
    • Data/information collection and analysis; note issues/deviations
    • Data recording, storage, access, aggregation, etc.
    • Oversight, corrective actions, and implications
    • Analyze and interpret; IQG pre-dissemination review; present/communicate
  • Check
    • Check quality processes for adequacy
    • Review oversight results for compliance with QA plans
    • Document issues, problems, and deviations
    • Characterize validity of a priori assumptions about measure/data quality
    • Assess implications for the intended use of, and goals for, the performance measure/data
    • Identify options to improve performance measure/data quality
  • Act
    • Evaluate measure/data quality options
    • Assess costs and benefits of options
    • Revise measures and QA plans as warranted
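This slide is a diagram in the source deck. A minimal sketch of how the Plan-stage acceptance criteria and the Check-stage review of deviations might look in code is below; the record fields, thresholds, and check logic are illustrative assumptions, not EPA or OMB requirements.

# Hypothetical sketch: applying Plan-stage acceptance criteria to a batch of
# performance-data records and documenting Check-stage deviations.
# Field names and thresholds are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class AcceptanceCriteria:
    """Plan: data quality objectives agreed on before collection."""
    required_fields: tuple = ("program", "measure", "value", "period")
    min_value: float = 0.0                       # e.g., counts cannot be negative
    allowed_periods: tuple = ("monthly", "quarterly")


@dataclass
class CheckResult:
    """Check: accepted records plus documented issues and deviations."""
    accepted: list = field(default_factory=list)
    deviations: list = field(default_factory=list)


def check_records(records, criteria):
    """Review each record against the acceptance criteria and log deviations."""
    result = CheckResult()
    for rec in records:
        problems = [f"missing field '{name}'" for name in criteria.required_fields
                    if name not in rec]
        if not problems:
            if rec["value"] < criteria.min_value:
                problems.append(f"value {rec['value']} below minimum {criteria.min_value}")
            if rec["period"] not in criteria.allowed_periods:
                problems.append(f"unsupported reporting period '{rec['period']}'")
        if problems:
            result.deviations.append((rec, problems))  # feeds the Act step
        else:
            result.accepted.append(rec)
    return result


if __name__ == "__main__":
    # Do: records collected during the reporting period (illustrative data only)
    batch = [
        {"program": "DERA", "measure": "engines retrofitted", "value": 125, "period": "quarterly"},
        {"program": "CW SRF", "measure": "green reserve awarded", "value": -3, "period": "quarterly"},
    ]
    outcome = check_records(batch, AcceptanceCriteria())
    print(f"{len(outcome.accepted)} record(s) accepted, "
          f"{len(outcome.deviations)} deviation(s) to review")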

9. Management Action Plan I
  • Explains the EPA Quality Program's role under ARRA
  • References existing grants, contracts, and IAs receiving ARRA funds and specifies which have environmental data collection components with Quality Management Plans and/or Quality Assurance Project Plans
  • A good check for ARRA environmental measures that have robust Plan/Do quality assurance

10. Management Action Plan II
  • Broadens the definition of performance data to include administrative data and other types
  • Lays out the "state of QA" for each performance measure
    • Inventories data standards and definitions, management controls for reporting and oversight, and data flow
  • Identifies gaps in QA processes and responsibilities
  • Requires programs to self-certify the completeness of data quality processes through a signed pre-dissemination review
  Bottom-line purpose (of PDCA, too): to minimize end-of-pipe errors in reporting performance results!

11. Pat's topics
  • Where we are/were headed when ARRA intervened: the larger vision and context
  • What MAP II represents compared to the whole effort: limited scope, but progress in involving the PM community
  • Progress outside of MAP II
  • Opportunities for participation

12. Performance Measurement
  • Goal: to further QA in all OMB-related performance measures for EPA
    • Beyond performance data reporting from ARRA-funded programs
    • Broaden to an Agency-wide QA context
  • Plan, Do, Check, Act: the quality life cycle
  • Information Quality Guidelines (IQGs)
  • CIO 2106: the new Quality Policy

13. ARRA Progress
  • MAP I: QA of environmental data in ARRA financial instruments
  • MAP II: QA adds performance data and information for ARRA measures
  • Communication with and input from:
    • NPO and Regional QA communities
    • IT/IM community (some affected by ARRA)
    • Stimulus Steering Committee: planners from five program offices

14. What's NOT in MAP II
  • QA of performance measures in the life-cycle framework (ARRA measures are set; ICR issue)
  • IT/IM QA for all types of databases
    • Enterprise systems (CERCLIS, SDWIS)
    • End-user systems (spreadsheets)
  • Guidance from IT/IM QA standards references
    • GAO auditing protocols (FISCAM)
    • Corporate accounting firms
    • OEI policy
  • QA review/approval of the procedures

15. Performance Measurement QA
  • Plan with stakeholders, internal and external
  • Apply the QA graded approach to assure:
    • Adequate QA resources for oversight
    • Performance measures and data that are adequate for their intended use
  • Document, validate, and verify data flow processes
  • Use the life-cycle "Check" and "Act" steps for evolution toward environmental outcomes

16. Opportunities for Input
  • IT/IM managers: performance data quality expectations for
    • Large systems
    • Small, spreadsheet-scale systems
  • QA managers: QA adequacy of
    • Non-environmental data and information used to support the budget process
    • Environmental data with additional intended uses in performance reporting
    • The performance reporting process

17. More Opportunities for Management Input
  • Senior managers: views on certification
  • Program managers: consult with IT/IM and QA managers on applying PDCA to design performance measures more efficiently and apply the graded approach
    • To ensure performance data are adequate
    • To ensure data flow procedures are effective
    • To ensure verification and validation are transparent and documented
