
The Importance of Achieving Quality Outcomes - An Overview



Presentation Transcript


  1. The Importance of Achieving Quality Outcomes - An Overview Jennifer Johnson July 8, 2013

  2. Overview of federal accountability system

  3. Long-Term Move Towards Performance Performance management is not a new phenomenon: over 50 years of work in the making to link resources with results • Budget and Accounting Procedures Act (BAPA) of 1950 • Planning-Programming-Budgeting System (PPBS), 1965-1971 • Management by Objectives (MBO), 1973-1974 • Zero-Based Budgeting (ZBB), 1977-1981

  4. GPRA • Originally passed in 1993 as the Government Performance and Results Act (GPRA) • Updated by the GPRA Modernization Act of 2010 (GPRAMA) • GPRA has evolved from an initial effort to develop performance measures into a more mature, modernized law that specifies requirements and measures, as well as processes for ongoing monitoring of those measures.

  5. GPRAMA • Continues three agency-level products from GPRA 1993, but with changes • Establishes new products and processes that focus on goal-setting and performance measurement in policy areas that cut across agencies • Brings attention to using goals and measures during policy implementation • Increases reporting on the Internet • Requires individuals to be responsible for some goals and management tasks.

  6. Potential Benefits of GPRAMA • Adopting a more coordinated and crosscutting approach to achieving meaningful results. • GPRAMA could help inform reexamination or restructuring efforts and lead to more efficient and economical service delivery in overlapping program areas by identifying the various agencies and federal activities that contribute to crosscutting outcomes. • These program areas could include multiple employment and training programs or numerous teacher quality initiatives, among others.

  7. Potential Benefits of GPRAMA • Addressing weaknesses in major management functions. • Agencies need more effective management capabilities to better implement their programs and policies. • GPRAMA requires long-term goals to improve management functions in five key areas: financial, human capital, information technology, procurement and acquisition, and real property.

  8. Potential Benefits of GPRAMA • Ensuring performance information is both useful and used in decision making. • Agencies need to consider the differing needs of various stakeholders, including Congress, to ensure that performance information will be both useful and used. • For performance information to be useful, it must be complete, accurate, valid, timely, and easy to use. • Yet decision makers often do not have the quality performance information they need to improve results. • To help address this need, GPRAMA requires (1) disclosure of information about accuracy and validity, (2) data on crosscutting areas, and (3) quarterly reporting on priority goals on a publicly available Web site.

  9. Potential Benefits of GPRAMA • Instilling sustained leadership commitment and accountability for achieving results. • GPRAMA assigns responsibilities to a Chief Operating Officer and Performance Improvement Officer in each agency to improve agency management and performance.

  10. Potential Benefits of GPRAMA • Engaging Congress in identifying management and performance issues to address. • GPRAMA significantly enhances requirements for agencies to consult with Congress.

  11. DD Council GPRA Measures • Increase the percentage of individuals with developmental disabilities reached by the Councils who are independent, self-sufficient and integrated into the community. (Outcome)

  12. DD Council GPRA Measures • Increase the number of individuals with developmental disabilities reached by the Councils who are independent, self-sufficient and integrated into the community per $1,000 of federal funding to the Councils. (Efficiency)
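To make the arithmetic behind this efficiency measure concrete, here is a minimal sketch; the function name and the sample figures are illustrative, not from the presentation:

```python
def efficiency_per_thousand(individuals_reached, federal_funding_dollars):
    """Individuals reached per $1,000 of federal funding (hypothetical helper)."""
    return individuals_reached / (federal_funding_dollars / 1000)

# Illustrative figures only: 500 individuals reached on $250,000 of funding
print(efficiency_per_thousand(500, 250_000))  # → 2.0
```

A rising value means the Councils are reaching more people for each funding dollar, which is why the deck labels this an efficiency (rather than outcome) measure.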

  13. DD Council GPRA Measures • Number of individuals with developmental disabilities reached by the Councils who are independent, self-sufficient and integrated into the community. (Output) • Number of all individuals trained by the Councils. (Output)

  14. How are DD Council GPRA Measures Used? • AIDD analyzes the data for trends and results • ACL includes the results in budget justifications that go to the Office of Management and Budget (OMB) and then to Congress

  15. What happens to the GPRA Measures? AIDD → ACL → HHS → OMB → Congress

  16. Why measure?

  17. What gets measured gets done. –Peter Drucker

  18. Why Measure? To Plan? To Comply? To Manage? To Optimize? To Innovate?

  19. Performance Management • Translates a mission into reality and evaluates the results for all stakeholders • Transparency of performance to: donors, elected leaders, senior management, oversight entities, employees, customers, partners

  20. Performance Management • Strategic Level • Measure Progress on Issues • Define & Validate Policy Strategies • Enhance Stakeholder Satisfaction and Support • Operational Level • Drive Change to Implement Organizational Strategies • Ensure Compliance • Achieve Efficiencies • Improve Cycle Time • Individual Level • Improved Morale/Retention • Achieve Clarity of Responsibilities

  21. Approaches to Performance Management • Program Evaluation • Performance Measurement

  22. Effective Performance Measures are SMART: Specific, Measurable, Accountable, Results-Oriented, Time-Bound

  23. Applying SMART • End outcome: Reduce smoking-related deaths, illness, and costs • Intermediate outcome: “Reduce the number of new youth smokers (ages 10-18) by 2% each year” • Results-oriented: Youth is where you can stop the habit before it takes hold and has a lasting health impact • Specific: “number of new youth smokers (10-18)” • Accountable: You have the ability to make it happen • Measurable: “reduce by 2%” • Time-bound: “each year”
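A 2%-per-year target like the one above compounds over time. A small sketch of the implied target trajectory; the baseline of 10,000 new youth smokers is a made-up illustration, not a figure from the presentation:

```python
def target_trajectory(baseline, annual_reduction, years):
    """Yearly targets implied by a fixed percentage reduction (illustrative)."""
    return [round(baseline * (1 - annual_reduction) ** y) for y in range(years + 1)]

# Hypothetical baseline: 10,000 new youth smokers, reduced 2% each year for 3 years
print(target_trajectory(10_000, 0.02, 3))  # → [10000, 9800, 9604, 9412]
```

Writing the target out this way shows why a measurable, time-bound phrasing matters: each year's figure is checkable against data.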

  24. Performance Measure Selection Criteria • Meaningful: Reports tangible and significant accomplishments against objectives; useful for decision makers and other stakeholders • Clear: Easily understood by managers, partners, and other stakeholders; tells a clear story • Legitimate: Accepted as legitimate by those who must use the data • Consistent: Clear definition and data-collection methodology across populations • Reliable: Captures what it purports to measure in an unbiased fashion • Granular: Able to detect performance movement • Relevant: Does not become obsolete too early; sets a pattern or baseline of performance

  25. Performance Measure Selection Criteria (continued) • Technically Adequate: Available data meets technical adequacy standards such as accuracy, validity, and timeliness • Responsible: Does not have unintended and undesirable consequences • Actionable: Indicates what is good or bad, driving desired behavior and the timing of action • Accountable: Related to direct action or influence of an accountable and attributable entity • Balanced: One of a set of measures providing a clear picture of the full range of performance • Vital: Measures are mutually exclusive and collectively exhaustive • Feasible: Reasonable cost and accessibility of data not already collected

  26. Culture shift • From: • These measures are draining valuable resources and are a data burden • I can’t measure my outcomes; I can only measure activities • I need these measures because my funders feel it is important • You can’t measure my program • To: • We are committed to tracking the measures that matter most • We are accountable for delivering our outputs and our intermediate outcomes • We are responsible for our end outcomes

  27. Performance Measures Definitions • Performance measures: Indicators, statistics, or metrics used to gauge program performance • Target: Quantifiable characteristic that communicates to what extent a program must accomplish a performance measure • Outcome measures: The intended result of carrying out a program; they define an event or condition, external to the program, that is of direct importance to the public • Output measures: Describe the level of activity that will be provided over a period of time • Efficiency measures: Capture the skillfulness in executing programs, implementing activities, and achieving results

  28. “Not everything that can be counted counts, and not everything that counts can be counted.” - Albert Einstein Current state of measurement • Too many measures • Wrong kinds of measures: too process- and activity-oriented; no clearly defined “logic model”; no measures of strategy; few measures of end outcome • Diluting measures: measuring only the things you can count rather than the things that are strategically important

  29. Identifying Characteristics of Effective Performance Management Systems

  30. 8 Critical Success Factors for Effective Performance Management Systems • Defining and Aligning to Enterprise Strategy • Developing Meaningful Performance Measures • Increasing Data Availability • Maximizing Data Integrity • Enhancing Performance Reporting • Improving Evaluation and Analysis • Achieving Performance Integration • Driving Decision-Making

  31. #1 Defining & Aligning to Enterprise Strategy 1.1 Has clearly defined its mission, vision and values 1.2 Has specific strategies in place to achieve organizational results (based on a SWOT or other strategic landscape analysis) 1.3 All structures (divisions, support functions) are fully aligned with enterprise-wide strategies 1.4 A formal strategic plan is clearly communicated to all employees at all levels of the organization

  32. #2 Developing Meaningful Performance Measures 2.1 Reliable measurement and reporting on Outcomes 2.2 Reliable measurement and reporting on Strategies 2.3 Organizational process metrics (Quality, Cycle Time, Efficiency) 2.4 Goals and measures enjoy support and buy-in from internal and external stakeholders

  33. #3 Increasing Data Availability 3.1 Data sources are identified and readily accessible 3.2 Data burden is worth the information gleaned

  34. #4 Maximizing Data Integrity 4.1 Data is collected, managed, and analyzed in a uniform and consistent manner 4.2 Data is validated or verified through sampling or independent means

  35. #5 Enhancing Performance Reporting 5.1 Internal reporting produces information for frontline managers and senior decision-makers on a “real time” basis. 5.2 Has a reporting system that produces comprehensive performance reports that include measures, analysis, trends, suggestions for improvement

  36. #6 Improving Evaluation and Analysis 6.1 For process measures, benchmarks and service levels are evaluated (1-2 year cycles) 6.2 For outcome and strategy measures, program performance is evaluated for “cause-effect” (2-5 year cycles)

  37. #7 Achieving Performance Integration 7.1 INTERNAL Integration: Support services’ contributions (HR, IT, Finance, etc.) to program performance are documented and managed 7.2 EXTERNAL Integration: Performance contributions of multiple contributors in the same measurement area are tracked and compared

  38. #8 Driving Decision-Making 8.1 Budgets and investments are made based on clear contributions to performance 8.2 Supply chain partners are held accountable for products and services 8.3 Employee bonuses and pay increases are linked to individual performance evaluations.

  39. Understanding Logic Models

  40. What is a logic model? • Logical chain of events providing blueprint for mission achievement • Graphic representation that illustrates the rationale behind a program or organization • Depicts causal relationships between activities, strategies, and end results • Contains goals and performance measures • Integrates various program activities into a cohesive whole • Vehicle for dialogue, planning, program management and evaluation

  41. What does a logic model look like? • Graphic display of boxes and arrows; vertical or horizontal • Relationships, linkages • Any shape • Circular, dynamic • Cultural adaptations, storyboards • Level of detail • Simple • Complex • Multiple models

  42. Logic modeling is based on mapping and defining linkages between what we do and why we do it, expressed as a series of IF-THEN relationships. Example: IF I work out for one hour each day, THEN I will burn more calories than I consume; THEN I lose fat and build muscle; THEN I improve my looks and health; THEN I have a better self-image, feel better, and live longer (Inputs → Outputs → Outcomes). Assumptions: improving looks = better self-image. Factors: health history.

  43. Clarifying the terms • Inputs: People and resources required to achieve outcomes • Activities/Outputs: What the inputs produce • Immediate and intermediate outcomes: Changes required to achieve the end outcome • End outcome: End goal or ultimate benefit • Assumptions: Beliefs or evidence that support your IF-THEN logic • Factors: External influences beyond your control that affect IF-THEN relationships
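These terms can be sketched as a simple data structure, reusing the workout example from the previous slide. The dictionary layout and the `if_then_chain` helper are illustrative conveniences, not part of the presentation:

```python
# Hypothetical representation of the workout logic model (illustrative only)
logic_model = {
    "inputs": ["work out for one hour each day"],
    "outputs": ["burn more calories than I consume"],
    "intermediate_outcomes": ["lose fat and build muscle"],
    "end_outcomes": ["better self-image, feel better, live longer"],
    "assumptions": ["improving looks = better self-image"],
    "factors": ["health history"],
}

def if_then_chain(model):
    """Render the model's causal chain as one IF-THEN sentence."""
    stages = ("inputs", "outputs", "intermediate_outcomes", "end_outcomes")
    steps = [item for stage in stages for item in model[stage]]
    return "IF " + " THEN ".join(steps)

print(if_then_chain(logic_model))
```

Keeping assumptions and factors alongside the chain mirrors the slide's point: the IF-THEN links only hold under stated beliefs and external influences.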

  44. Most logic models incorporate the following elements, arranged along a continuum from control to effect: Inputs → Activities → Outputs → Intermediate Outcomes (attitudes, behaviors, conditions) → End Outcomes. Asking “why?” of each element moves toward outcomes; asking “how?” moves back toward inputs.

  45. Inputs • Resources used to support activities • Not performance measures • System integration: link inputs through process to goals and outcomes in budget requests, human resource plans, and information technology plans

  46. Activities • Processes and roles: what the program does • Subject of ongoing process improvement and strategy change • System integration through linkages to stakeholder satisfaction, assessment quality, desired outcomes, and efficiency measures

  47. Outputs • Products and services delivered (e.g., grants, audits, research studies, impact assessments) • Indicate strategy deployment • Foundation step for attainment of all types of outcomes • Often measured by low-level outcome types: process vital signs, customer satisfaction • Good source of short-term, readily available results • Time frame: 12 months (1 year)

  48. Intermediate Outcomes • Show cross-agency/program accountabilities • Can be one or several in number • Often measured by attitudes, behaviors, and conditions • Can show short- to medium-term change • Time frame: 1-5 years

  49. Connecting strategies, intermediate outcomes and measures: U.S. Department of Labor Women’s Bureau example • Strategy: Provide training in high-growth, demand-driven occupations to women • Intermediate outcome: Increase hard skills in high-growth, demand-driven occupations for participants • Intermediate outcome performance measure: % of women participants who successfully complete training, education, or certification for high-growth, demand-driven occupations

  50. End Outcomes • Show the ultimate benefit to the taxpayer • Often measured by long-term indicators: changes in economic and policy conditions • Captured in the strategic plan as end goals • Time frame: 5-10 years
