The role of leadership in performance management



The Role of Leadership in Performance Management

Donald P. Moynihan,

La Follette School of Public Affairs,

University of Wisconsin-Madison

Presentation to Chicago Federal Leadership Forum



Have you encountered?

  • Strategic planning

  • Performance measures

  • Performance contracts

  • Pay for performance



The role of leadership

  • “During my 20 years in the private sector as a CEO and advisor to CEOs, I found that leadership, measurement, and a motivated workforce create the foundation for good performance. I am confident that the same is true in government.”

    • Jeff Zients – Chief Performance Officer, 2009



Outline

  • Defining terms

  • Era of governance by performance management

  • From Bush to Obama

  • How do we use performance systems?

  • What fosters use of performance data?

  • Summary points



Defining terms



Performance management

  • A system that generates performance information through strategic planning and performance measurement routines, and connects this information to decision venues.



Performance regimes

  • Performance tools create unprecedented pressure on public actors to perform, in a context where performance is defined by quantitative indicators



Purposes of Performance Information

  • Promote: How can I convince political actors, stakeholders and the public that my agency is doing a good job?

  • Celebrate: What accomplishments are worthy of the important ritual of celebrating success?

  • Learn: What is working or not working?

  • Improve: What exactly should who do differently to improve performance?



Purposes of Performance Information

  • Evaluate: How well is my agency performing?

  • Control: How can I ensure that my subordinates are doing the right thing?

  • Budget: On what programs, people, or projects should my agency spend the public’s money?

  • Motivate: How can I motivate employees and collaborators to improve performance?



Era of governance by performance management



Era of Governance by Performance Management

  • The rise of a doctrine

  • Not new, but more influential than before

  • Must justify actions in terms of outputs and outcomes

  • Basis for holding new structural forms accountable



Doctrinal logic for change



Government Performance and Results Act 1993

  • Mandated:

    • 5-year strategic plans, updated every 3 years

    • Specific goals and objectives

    • Annual performance reviews and plans



From Bush to Obama



Bush approach

  • President’s Management Agenda

    “everyone agrees that scarce federal resources should be allocated to programs that deliver results”

  • Wanted to integrate performance data into budget process



Congressional Justifications

  • Centered on performance goals

  • Pushback from Appropriations Committees

    • Veterans Administration told “to refrain from incorporating ‘performance-based’ budget documents”; later told: “If the Department wishes to continue the wasteful practice of submitting a budget structure that will not serve the needs of the Congress, the Congress has little choice but to reject that structure and continue providing appropriations that serve its purposes.”

  • Two budgets required



Congressional Justifications

  • Department of Transportation told: “agencies are directed to refrain from including substantial amounts of performance data within the budget justifications themselves, and to instead revert to the traditional funding information previously provided. Performance-related information may be submitted under separate cover.”

  • Negative consequences were promised for agencies that ignored this directive: “If the Office of Management and Budget or individual agencies do not heed the Committee’s direction, the Committee will assume that individual budget offices have excess resources that can be applied to other, more critical missions.”



Program Assessment Rating Tool (PART)

  • 5-year summary by OMB of evidence on program performance for 1,016 programs

    • 18 percent are Effective

    • 31 percent are Moderately Effective

    • 29 percent are Adequate

    • 3 percent are Ineffective

    • 19 percent are Results Not Demonstrated



PART as Evidence-based Dialogue

  • Third-party program review with a clear opinion

  • Greater emphasis on performance

  • The standard of proof for program performance can only be satisfied by positive evidence of results

  • The burden of proof for performance rests on agencies

  • Entire programs are evaluated on a regular basis

  • The routine nature of PART creates an incentive to engage



Obama: A pragmatic approach

  • “The question we ask today is not whether our government is too big or too small, but whether it works -- whether it helps families find jobs at a decent wage, care they can afford, a retirement that is dignified. Where the answer is yes, we intend to move forward. Where the answer is no, programs will end. And those of us who manage the public's dollars will be held to account, to spend wisely, reform bad habits, and do our business in the light of day, because only then can we restore the vital trust between a people and their government”



Example: Pedometer challenge!

  • Voluntary

  • Belief that transparent performance numbers will change behavior, create a sense of competition and raise performance



Early evidence on Obama

  • Performance measurement will be important

    • “The President is creating a focused team within the White House that will work with agency leaders and the OMB to improve the results and outcomes for Federal Government programs while eliminating waste and inefficiency”

    • Chief performance officer

    • Continue to maintain agency-level performance positions



What happens to PART?

  • Not clear

  • Criticized as ideological, as too broad, as a data collection exercise

  • Analysis remains in place, but new PARTs have not started

  • OMB has offered agencies funds for better evaluations



New emphasis on leadership

  • Focusing leaders on what matters – key goals

  • Accelerating results – Performance Improvement Council; data-driven meetings

  • Style: focused collaboration



New focus on information use

  • Will be a central aspect of the Obama administration’s performance initiatives

  • Jeff Zients: “The ultimate test of our performance management efforts is whether or not the information is used”

  • Shelly Metzenbaum: “the key performance management challenge facing the Obama administration is to use—not just produce—performance goals and measures”



How do we use performance systems?



Why care about use?

  • For reforms to succeed, performance data must actually be used

  • Provides a tractable means of studying the impact of results-based reform

  • Public organizations have devoted significant time and resources to creating routines to collect and disseminate data

  • Almost no attention to creating routines of use

  • How do you use performance data?



Types of responses: 4 Ps

  • Passive

  • Perverse

  • Political

  • Purposeful



Passive use of data

  • Passive:

    • Do the minimum to comply with requirements

    • Do not actually use data

    • Correlated with cynicism about reforms



Perverse use of data

  • Effort Substitution: Reducing effort on non-measured dimensions

  • Cherry picking/Cream-skimming: Focusing effort on subgroups of clients most likely to provide greatest impact on performance measures while effectively denying services to others.

  • Measure selection: Selecting metrics or data to measure that will offer the most favorable portrayal of a service

  • Hiding numbers: Declining to present performance measures that may exist



Perverse use of data

  • Output distortion: Manipulating measurement processes to improve measured performance.

  • Ratchet effects: Curbing productivity in one time period to avoid the setting of more challenging targets in another.

  • Churning: Frequently adopting different targets or measures to prevent comparison across time.

  • Cheating: Simply making up numbers, though rare, does occur.



Responding to perversity

  • Add new/additional measures

  • Change existing measures

  • Rely on/cultivate intrinsic norms to limit misbehavior

  • Avoid high-powered incentives



Political uses of data

  • Process of selecting measures means shaping a program narrative

  • “Understand that measuring policy is not a science. It is an art. It is words, and pictures and numbers. And you create impressions, beliefs, understandings and persuasions.”



Political uses of data

  • Data tells us what happened

  • Program officials still need to interpret and explain:

    • why performance did or did not occur;

    • the context of performance;

    • how implementation occurred;

    • an understanding of outside influences on performance; and

    • how to choose which program measure is a priority.

  • Exploit ambiguity and subjectivity of data



Political: Ambiguity of data

  • Examine same programs, but disagree on data

  • Agree on data, but disagree on meaning

  • Agree on meaning, but not on next action steps/resources



Political: Subjectivity of data

  • Actors will select and interpret performance information consistent with institutional values and purposes



Evidence of Ambiguity in PART

  • Ambiguity of terms:

    • E.g.: Program purpose, quality evaluation, ambitious, having made progress

  • How to interpret results? Multiple logics from experiment:

    • Argue that ratings are unreliable

    • Cut poorly managed programs

    • Raise funding for programs with positive assessments

    • Parity: Raise funding because program with similar assessment received more

    • Delay cuts because progress being made

    • Clear relationship between resources, need and program delivery

    • Stakeholder and congressional views



Evidence of Subjectivity with PART

  • OMB using PART to expand influence in performance management/policy

    • OMB can define programs, goals, measures, agency responsibility

  • Disagreement with agencies/Congress on meaning/relevance of PART

  • Experimental evidence:

    • UW students significantly more likely to disagree with OMB, and to argue for higher assessments and resources



Implications for Decisionmaking

  • Performance information use reflects political process, does not replace it

  • Performance information use does not lead to clarity

  • Ability to structure dialogue tied to power



Purposeful use of data

  • Use data to improve program performance

  • Goal-based learning

    • efficiency improvements

    • better targeting of resources

    • more informed strategic decisions

    • tying indicators to rewards/sanctions in contract arrangements



Purposeful use of data

  • Use of performance information for problem-solving more likely to occur in intra-institutional settings

    • Reduces competing interpretations

  • Problem of neglect

    • rarely do anything with information



Learning forums

  • Routines specifically focused on solution-seeking, where actors collectively examine information, consider its significance and decide how it will affect future action

  • What measures are useful for agency officials?

  • What other ways can we encourage learning forums?



What fosters performance information use?



The Right Context

  • Simple function that is easy to measure

  • Clear link between measures of actions, and measures of outcomes

  • One-dimensional – relatively few measures that do not conflict with one another

  • Stakeholder support – clear agreement about purpose



Other factors

  • Learning forums

  • Mission-based culture/supportive culture

  • Resources

  • Administrative stability

  • Administrative capacity



Quantitative approach

  • 3 studies using survey-based data

  • Self-reported performance information use

  • Results from Moynihan and Pandey (in press) and Moynihan, Wright and Pandey (2009; 2010)



Study 1: Ordinal regression of reported performance information use for decisions



Intrinsic vs. extrinsic motivation

  • Sense of public service motivation mattered

  • Possibility of extrinsic reward did not create an incentive to use data

  • Implication: performance information use as extra role behavior



Organizational factors

  • Information availability

    • Supply-side approach

    • Use increases with better information, and when information is tied to management systems



Organizational factors

  • Demand side approach

    • Culture matters

      • Previous work focuses on whether culture welcomed performance management reforms

      • What about broader measures of culture?

      • Developmental culture (adaptability, readiness, growth)

    • Flexibility – unlikely to use data if insights cannot be applied



Specialist vs. generalist leaders

  • Task-specific knowledge provides context in which to interpret and apply data

  • Leadership role

    • Task-specific leaders more likely to use data than generalist leaders



Other evidence of leadership

  • Support/commitment

  • Provision of resources

  • Participation

  • What other ways can leadership matter?



Study 2: Transformational leadership

  • Approach to leadership consistent with performance:

    • Articulate an appealing vision of the organization’s mission and future

    • Model behavior consistent with vision, inspiring role model

    • Challenge old assumptions



Propositions

  • Transformational leadership behaviors will have an indirect, positive effect on performance information use through their influence on goal clarity.

  • Transformational leadership behaviors will have an indirect, positive effect on performance information use through their influence on organizational culture.



Key measures

  • Transformational leadership

    • Asked department heads/assistant city managers about the extent to which the city manager demonstrates transformational leadership:

    • articulates his/her vision of the future.

    • leads by setting a good example.

    • challenges me to think about old problems in new ways.

    • says things that make employees proud to be part of the organization.

    • has a clear sense of where our organization should be in five years.

    • Aggregated responses by organization



Structural Equation Model



Implications

  • Leadership and management

  • Indirect effects are important

  • “Setting the table” as long-term leadership strategy



Study 3: Perceived social impact

  • Individuals who see their work as helping others more likely to use performance information

  • Some evidence that individuals who perceive greater social impact are more motivated

  • Why should it relate to performance information use?



Key measures

  • Perceived social impact

    • I feel that my work makes a positive difference in other people’s lives.

    • I am very aware of the ways in which my work is benefiting others.

    • I am very conscious of the positive impact my work has on others.

    • I have a positive impact on others in my work on a regular basis.



Purposeful and political use

Purposeful

  • I regularly use performance information to make decisions.

  • I use performance information to think of new approaches for doing old things.

  • I use performance information to set priorities.

  • I use performance information to identify problems that need attention.

Political

  • I use performance information to communicate program successes to stakeholders.

  • I use performance information to advocate for resources to support program needs.

  • I use performance information to explain the value of the program to the public.



Study 4: Experimental approach

  • How does performance information matter to decisions?

  • How does the framing of performance information affect decisions?

  • Respondents given surveys with scenario – make budget recommendations

  • Series of vignettes for different programs

  • Half the vignettes are controls, half are treatments



Theoretical background

  • Research on decision frames from psychology and behavioral economics

  • Performance information is strategically selected and presented – does this work?



Does the Addition of Performance Data Matter?

  • Control: no data; treatment: addition of data without clear correlation to resources

  • The Department of Land and Water Resources is responsible for monitoring and maintaining the water quality of lakes in the county, including two major lakes that are popular for swimming and other water sports during the summer. Estimates of water quality are based on pH levels, pesticides, nitrates and other chemicals in the water.



Does the Addition of Performance Data Matter?

  • Control: no data; treatment: addition of data with clear relationship to resources

  • The Department of Social Services delivers a program called the Home Downpayment Initiative. Using a mix of federal, state, and local resources, the program seeks to increase homeownership rates among low-income and minority households. To do so, it provides financial assistance to first-time homebuyers for downpayment and closing costs.



Is outcome data more powerful than output?

  • Control: output data; treatment: outcome data

  • The Department of Health Services offers a program called Health Check, which is a preventive health check-up program made available for anyone under the age of 21 who is currently enrolled in Medicaid. Health Check provides a head-to-toe medical exam, immunizations, eye exam, lab tests, growth and development check, hearing check, nutrition check, and teen pregnancy services. The goal of the program is to prevent the incidence of more serious and more expensive health situations.



Threshold effects

  • Treatment: performance data pass a memorable threshold (200,000)

    The County Tourism Board seeks to increase visits from those who live outside the county, and to increase the use of recreational and cultural opportunities by both locals and outsiders. It collects data from local hotels, restaurants, and other businesses that depend on tourists. In recent years, the number of tourists visiting the county has stayed relatively flat at about 100,000, and the Board has focused its marketing budget on “quality, not quantity,” by increasing the dollar amount that each tourist spends.



Including equity measures

  • Treatment: addition of equity measure that aligns with mission

  • The Department of Social Services funds the Early Intervention Program, which provides services for children three and under with developmental delays and disabilities. The mission statement for the Early Intervention Program is: “Our mission is to provide access to therapies that improve child developmental outcomes.” The program is administered by a non-profit, and employs therapists to work with children and families in the home environment.



Summary points: what to do

  • Move beyond passive and limit perverse use

  • Focus on political use

    • What is the narrative of your program?

    • What goals are meaningful and telling? How do they relate to the narrative?

    • What goals are essential to explaining program purpose and achievement?

    • How do you frame and communicate measures? Who is your audience?



Summary points: what to do

  • Focus on purposeful use

    • Provide resources, be involved, make clear that it is important

    • Encourage right context for use

      • Foster goal clarity

      • Encourage supportive culture

    • Create and support learning forums

    • Appeal to intrinsic motivation

      • Focus on demonstrating significance of measures



Questions/Comments

  • dmoynihan@lafollette.wisc.edu

  • http://www.lafollette.wisc.edu/facultystaff/moynihan-donald.html

  • The Dynamics of Performance Management

    • Georgetown University Press

