
Training Session Sponsored by the Association of Government Accountants

A Systems Approach to Implementing Performance-Based Management and Budgeting. Audio Conference, March 21, 2012.


Presentation Transcript


  1. Training Session Sponsored by the Association of Government Accountants A Systems Approach to Implementing Performance-Based Management and Budgeting Audio Conference March 21, 2012

  2. Today’s Presenters • Stephen L. Morgan, President, EGAPP, Inc., and former Austin City Auditor can be reached at egappmorgan@yahoo.com • Sam McCall, City Auditor, Tallahassee, Florida and former Deputy State Auditor of Florida, can be reached at Sam.McCall@talgov.com

  3. A Systems Approach to Performance Based Management and Budgeting • Introduction – Performance Accountability System • Historical Overview – Where We Have Been in Austin and Beyond • Performance Planning • Performance Budgeting • Performance Measurement and Reporting • Performance-Based Decision Making • Conclusion – What We Have Learned and Where We Are Going

  4. I. 1. Introduction: Government Performance Accountability System (cycle) • PLAN: Strategic & Annual Planning • DO: Performance Budgeting • CHECK: Performance Measurement & Reporting • ACT: Performance-Based Decision Making

  5. I. 2. Managing for Results Framework, City of Austin (diagram) • Four connected components: Business Planning → Performance Budgeting → Performance Measurement & Reporting → Performance-Based Decision Making • Business Planning: Program/Activity Objectives; Organizational and Individual Performance Measures; Structural Alignment • Performance Budgeting: Performance Targets; Accounting System • Measurement, Reporting, and Decision Making: Individual SSPR Evaluations; Organizational Performance Assessment; Performance and Measurement Audits • Stakeholders at the center: Citizens, Council, Managers, Employees

  6. I. 3. Characteristics of a Successful System • Use existing data whenever possible • Find a balance between too few and too many measures • Audit the data regularly • Modify measures when necessary • Centrally located staff to analyze data and coordinate the system elements • Technological infrastructure to support the system

  7. I. 3. Characteristics of a Successful System (continued) • Data forms should have space for explanatory information and detail • Tie measures to the budgetary allocation and reward systems • Support of top management • Over the long run, the system should affect the bottom-line performance of the organization • Citizens will be better informed and more participative

  8. II. 1. Where We’ve Been (in the City of Austin) … • 1992 – Council Resolution on Performance Measurement and Reporting • 1994 – First Performance Measurement & Reporting System Audit • 1996 – Second Performance Measurement and Reporting System Audit; Program Budgeting implemented • 1998 – Third Performance Measurement and Reporting System Audit

  9. II. 2. Where We’ve Been… • 1998 Corporate Managing for Results Initiative Defined • Simplify our System • Clarify the Information We Provide • Develop Measures that are Meaningful to our Employees • Focus on Cost • Developed a Standard manual--The Resource Guide • Trained over 200 managers • Developed a Single Accounting System • Identified Key Performance Measures for Executive SSPRs • Corporate Review Team • 1999 Corporate Partnership Implements CMO Initiative

  10. II. 3. Where We’ve Been… • 2002 Fourth Audit of the Performance Management System • Ongoing Integrated System • Information Used for Operational Management • Measures Are Relevant and Reliable • Budgets Are More Data and Results Driven • Managers and Supervisors Fully Trained • Performance Measures Supported by More Robust Technology • Improvements Made to City’s Website and Stakeholder Access to Performance Information • Citizen and Employee Surveys Provide Data for Selected Performance Measures • 2003-2008 Continuous Improvement

  11. II. 3. Where We Are Now… • 2008-Current • Website Robust with Capacity to “Drill Down” and “Search” through Performance Measures Database • “Managing for Results” Used as Business Planning and Performance Monitoring Model for More than a Decade--Now Part of City Culture • Performance Report on Website Tracks 115 Key Departmental Measures, of Which 21 Are Designated Citywide Key or “Dashboard” Measures • Performance Comparisons Presented in Graphics with Goals/Targets and Measures Tracked Over Five Years • Performance Report for 2009-2010 Received “Certificate of Excellence” from ICMA in October 2010 • Annual Citizen Surveys Strengthened to Include Focus Groups and Presentations to City Council • “Best Practice Citizen Centric” External Performance Accountability Report Is Needed

  12. II. 4. Beyond Austin: Federal Government Performance Management Continues to Evolve • Government Performance and Results Act (GPRA) of 1993 • Executive Order 13450, Improving Government Program Performance, Nov. 13, 2007 • OMB Memorandum M-10-24: Performance Improvement Guidance under GPRA for 2011-2012 • GPRA Modernization Act of 2010 (signed Jan. 4, 2011)

  13. II. 4. Federal Agencies with Well-Developed Performance Management Systems • Social Security Administration • Department of the Interior • Government Accountability Office • Nuclear Regulatory Commission • Office of Personnel Management

  14. Some Local, State, & Provincial Governments Have Established Performance Management Mandates

  15. State and Local Governments with Well-Developed Performance Management • States of Florida, Washington, Texas, Missouri, and Oregon (recognition may reflect individual state departments that are mature and excel in developing and applying performance management systems) • Local governments include Austin, King County, Phoenix, Bellevue, Charlotte, Portland, Palo Alto, and Tallahassee • Auditors have played key roles in many performance measurement and management initiatives

  16. III. Performance Planning III. 1. Establishing programs, activities, and potential performance expectations III. 2. Developing annual business/performance plans with performance expectations and measures III. 3. Reviewing business/performance plans to support improvement and accountability

  17. III. 1. Service Delivery System (Program Model): Input → Process → Output → Intermediate Outcome → Long-term Outcome, yielding Community Impact; other contributing factors also influence results

  18. III. 1. Service Delivery System: Cause/Effect Relationships • Service Efforts: Inputs → Processes • Service Accomplishments: Outputs → Outcomes • Financial Inputs/Outputs = Unit Cost • Outputs/Physical Inputs = Productivity • Inputs/Outcomes = Cost-Benefit and Cost-Effectiveness
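The three ratios on this slide are simple divisions; as a minimal sketch, the snippet below computes them for a hypothetical inspection program (all figures and names are invented for illustration, not from the presentation):

```python
def unit_cost(financial_inputs, outputs):
    """Financial Inputs / Outputs = Unit Cost."""
    return financial_inputs / outputs

def productivity(outputs, physical_inputs):
    """Outputs / Physical Inputs = Productivity."""
    return outputs / physical_inputs

def cost_effectiveness(financial_inputs, outcomes):
    """Inputs / Outcomes = Cost-Effectiveness."""
    return financial_inputs / outcomes

# Hypothetical figures for one program
budget_dollars = 500_000       # financial inputs
staff_hours = 10_000           # physical inputs
inspections_completed = 2_500  # outputs
violations_corrected = 400     # outcomes

print(unit_cost(budget_dollars, inspections_completed))         # 200.0 dollars per inspection
print(productivity(inspections_completed, staff_hours))         # 0.25 inspections per staff hour
print(cost_effectiveness(budget_dollars, violations_corrected)) # 1250.0 dollars per corrected violation
```

The same three functions apply to any activity once its inputs, outputs, and outcomes are counted consistently.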

  19. Government Performance Expectations (Mission → Performance Objectives/Goals) • OUTCOME (Effectiveness): Mission & outcome goal achievement; Financial viability; Cost-benefit; Cost-effectiveness • OUTPUT (Effectiveness): Quantity; Quality (products, delivery); Timeliness; Price or cost • PROCESS (Efficiency): Productivity; Unit costs; Operating ratios • INPUT (Economy & Sufficiency): Financial (amount, timing); Physical (quantity, quality, timing, price); Capacity vs. demand

  20. III. 1. Service Delivery System: Auditing Program or Activity • Inputs: Staff; Funding; Equipment; Facilities/Rent • Processes: Audit process (survey, fieldwork, & reporting) • Outputs: Reports; Briefings; Presentations • Outcomes: Qualitative – policy/system/management improvements; Quantitative – cost savings/revenue enhancement; Preventive – deterrence/detection

  21. III. 1. Program/Activity Mapping Template: Inputs → Process → Outputs (Services Delivered) → Outcomes (Results)

  22. III. 1. Austin’s Definition of Programs • Activity = Input → Process → Output → Outcome • Program = group of activities with a common purpose. Example: the Audit Program consists of four activities: • Performance Audits • Investigations • Consulting and Assistance • Quick Response

  23. III. 2. Overview of the Development of Business Plans (flow diagram) • An environmental scan and change dynamics inform planning • Services sharing a common purpose are grouped into Activities (A–E), each with a service objective and a family of performance measures (result, output, efficiency, demand) • Activities sharing a common purpose roll up into a Program with a program objective • Each activity produces a Key Result (A–E); results are the accomplishment of these key results in support of the Mission and Goals

  24. III. 2. Business Plan Alignment Worksheet with Definitions ALIGNMENT WORKSHEET BY ACTIVITY

  25. III. 2. Sample Business Plan Alignment Worksheet

  26. III. 3. Reviews: Corporate Improvement and Accountability • Review Team • Budget Office, Organizational Development, City Management • Structure • Does it provide for alignment of results? • Does it permit illumination of results and cost information in a manner useful to decision makers? • Results • Do objectives and measures match? • Was template used for best impact? • Measurability • Are goals measurable? • Are program and activity measures useful?

  27. III. 3. Plans: Consistent Process & Product • Program and Activity Objectives (MFR Template): The purpose of ________ is to provide ________ to ________ so they can ________ • Performance Measures (A Family of Measures): Result measure… then • Outputs: How many? • Efficiency: At what cost? • Anticipated Demand

  28. III. 3. Activity Objective Statement (example) The purpose of the Combat Operations (program) is to provide/produce emergency incident response (service or product) to anyone in the service area (customer) in order to save lives and minimize property damage (planned benefit)

  29. III. 3. Performance Measures (example) • Result: Number of fire deaths per capita; Percent of fires confined to the room or area of origin after arrival of AFD (per census tract) • Efficiency: Average cost per call • Output: Number of calls (call volume) • Demand: Number of fire alarms (calls) expected
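As a sketch of how such a family of measures might be computed from raw incident data, the snippet below uses invented figures (none are actual AFD results; field names are assumptions):

```python
# Hypothetical raw data for one reporting period
fires_confined_to_origin = 180
total_structure_fires = 240
fire_deaths = 6
population = 800_000
total_cost = 9_600_000.0   # activity cost for the period
total_calls = 48_000       # call volume

# Result measures
result_deaths_per_capita = fire_deaths / population
result_pct_confined = 100 * fires_confined_to_origin / total_structure_fires

# Efficiency and output measures
efficiency_cost_per_call = total_cost / total_calls
output_call_volume = total_calls

print(f"{result_pct_confined:.1f}% confined to room/area of origin")  # 75.0%
print(f"${efficiency_cost_per_call:.2f} average cost per call")       # $200.00
```

Each measure maps to one row of the slide's result/efficiency/output/demand family; demand would be a forecast (expected alarms) rather than a computed ratio.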

  30. IV. Performance Budgeting IV. 1. Link annual plans and budgets IV. 2. Establish targets IV. 3. Collect cost accounting information

  31. IV. 1. Link Annual Performance Plans and Budgets • Ensure clear linkage between the plan’s programs and the budget’s programs • Ensure congruence between the plan’s goals, objectives, and targets and the budget’s goals, objectives, and targets

  32. IV. 1. The Budget – Linking Results, $$$, and People • In the Budget Document • Business/Performance Plan • Activity and Program Pages • Performance Measures: definitions, etc. • Using the Performance Budget to “Tell Your Story” • Changing the Conversation • This Result…At This Cost

  33. IV. 2. Establish Targets • Targets for each program and activity measure • Sources of criteria for setting targets • Historical trends and baselines • Program requirements or intent • Customer expectations or demands • Industry or sector standards • Benchmarking within the organization • Benchmarking outside the organization

  34. Sources of Performance Expectations --The process for identifying expectations and setting targets should be rigorous. --All sources have pros and cons so all should be considered when setting targets. www.AuditorRoles.org

  35. IV. 2. Examples of Performance Targets and Measures

  36. IV. 3. Base program budgets on unit costs that support desired program outputs and outcomes as reflected in targets • Activity-Based Costing (ABC) • Identify Direct and Indirect Costs
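A minimal activity-based costing sketch: direct costs are traced straight to each activity, and the indirect pool is allocated by a cost driver (direct labor hours here). The activity names echo the audit program from slide 22; all dollar and hour figures are invented assumptions:

```python
direct_costs = {          # costs traced directly to each activity
    "Performance Audits": 300_000,
    "Investigations": 120_000,
    "Consulting and Assistance": 80_000,
}
labor_hours = {           # cost driver used to spread overhead
    "Performance Audits": 6_000,
    "Investigations": 2_500,
    "Consulting and Assistance": 1_500,
}
indirect_pool = 150_000   # rent, IT, administration, etc.

# Overhead rate per labor hour
rate = indirect_pool / sum(labor_hours.values())

# Full activity cost = direct cost + allocated share of overhead
activity_cost = {
    a: direct_costs[a] + rate * labor_hours[a] for a in direct_costs
}
for activity, cost in activity_cost.items():
    print(f"{activity}: ${cost:,.0f}")
```

Dividing each activity's full cost by its targeted output count then yields the unit cost that the budget targets are built on.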

  37. IV. 3. Performance Budgeting • Long-sought “ideal” of budgeting experts: • Performance-driven budgeting. • Best-case reality: • Performance-informed budgeting. www.AuditorRoles.org

  38. V. Performance Measurement and Reporting V. 1. Individual Performance Appraisal V. 2. Organizational Performance Assessment and Reporting V. 3. Performance and Measurement Certification Audits

  39. V. 1. Establishing Accountability Key Points of Business Plan Alignment/ SSPR Integration • Every employee in the organization contributes to the City Vision • Every employee in the department contributes to the Mission of the department. • Every employee in the department contributes to at least one Business Plan Goal. • The Alignment Worksheets show employees how the Services they provide support specific Activities, Programs, and Goals in the Business Plan. • Performance Measures show citizens, City Council and employees how well we are doing. • Every Business Plan Measure must be written into at least one employee’s SSPR. • Every employee, including department executives, will have at least one Business Plan Measure in their SSPR.

  40. V. 1. Individual Performance Appraisal • Alignment Worksheet: Mission; Goals; Program; Activity; Services that comprise the Activity; Activity Performance Measures (Results, Efficiency, Demand, Output); Description of Services • Employee SSPR: Program Objective; Activity Objective; Activity Results Measure; Individual Performance Measure – the same as the Activity Performance Measure, part of the Activity Performance Measure, or one that contributes to the Activity Performance Measure

  41. V. 2. What Is a Performance Monitoring System? • Management Component: Program and levels; Performance goals; Performance indicators; Intended uses • Data Component: Data collection; Data processing • Analysis Component: Measurement of current performance levels; Comparison of current performance with criteria (performance goals) • Action Component: Decisions concerning goals; Decisions concerning programs and levels; Decisions concerning monitoring & evaluation

  42. V. 2. Ensure Performance Measure Definitions/Formulas Are Established Design a monitoring system to track and analyze the selected measures (efficiency, outputs, and outcomes are essential).

  43. V. 2. Ensure the Results of Performance Measures are Available for Analysis and Decision Making Design a reporting system that is easy to use, accessible to all interested parties, and enables management decisions.

  44. V. 2. Establish Performance Reporting “Best Practices” • Design reporting formats and decide frequency of reporting. Austin reports include: • Quarterly Performance Reports • Annual Performance Reports • Community Scorecard

  45. V. 2. Use Performance Reports to Improve Performance • Use performance reports to identify and direct analysis of program performance • Use analysis to identify the causes of inadequate program performance and focus improvements on causes • Use performance reports to identify high performance programs

  46. V. 2.1. City of Austin Performance Report • Departmental Performance Measures • Total of 115 Measures Grouped into Public Safety, Community Services, Infrastructure, and Utilities/Enterprise Departments • Each performance graphic includes: Measure Description, Calculation Method, Results, Assessment of Results, Next Steps, and Contact for More Information

  47. V. 2.1.City of Austin Performance Report • Decisions influenced by: • Stakeholder/citizen priority or demand • Stakeholder/citizen satisfaction • Results shown • City Council and Management priorities

  48. V. 2.2. Why the City of Tallahassee Supports Citizen Centric Reporting • Government officials have a responsibility to be good stewards, to spend the monies provided wisely, and to report financial and performance information back to citizens on accomplishments and challenges • Citizen reporting has its history in Efficient Citizenship Theory • Citizens are not the customers of government; they are its owners • Democratic government is best shaped by the choices of well-informed citizens

  49. V. 2.2. Why Is the City of Tallahassee Issuing a Citizen Centric Report? • We have a responsibility to inform our citizens about: • What we are responsible for doing • Where the money comes from that runs the City and where it goes • What we have accomplished with monies received and expended, and • What challenges face the City moving forward • We believe informed citizens make for better government
