Integrating Performance Measures into University Endeavor

Presentation Transcript


  1. Integrating Performance Measures into University Endeavor Victor M. H. Borden, Ph.D. Associate Vice President University Planning, Institutional Research, and Accountability (IU) Associate Professor of Psychology (IUPUI)

  2. Or Becoming an Evidence-Driven Learning Organization Victor M. H. Borden, Ph.D. Associate Vice President University Planning, Institutional Research, and Accountability (IU) Associate Professor of Psychology (IUPUI)

  3. Or How I Learned to Stop Worrying and Love Performance Measures Victor M. H. Borden, Ph.D. Associate Vice President University Planning, Institutional Research, and Accountability (IU) Associate Professor of Psychology (IUPUI)

  4. If this were a simple matter, you would have figured it out long ago and I wouldn’t be here. Do not expect my explanations to be simple or my advice to be straightforward. This will be more like a graduate-level seminar than an introductory course.

  5. I realize that I will not succeed in answering all of your questions. Indeed, I will not answer any of them completely. The answers I provide will only serve to raise a whole new set of questions that lead to more problems, some of which you weren’t aware of in the first place. When my work is complete, you will be as confused as ever, but hopefully, you will be confused on a higher level and about more important things. – The Institutional Research Credo

  6. Why Not “Data-Driven?” • Data, per se, are not what we need

  7. If Not Data-Driven, Then What? • Evidence-based practice to decide… • What to do • How best to do it • If it is working as desired • So that we can learn from what we do and improve • We want to be part of a Learning Organization

  8. Learning Organizations • …organizations where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to see the whole together. (Senge, 1990)

  9. Learning Organizations • …are characterized by total employee involvement in a process of collaboratively conducted, collectively accountable change directed towards shared values or principles. (Watkins and Marsick, 1992)

  10. Overview • Lessons I’ve learned (the hard way) about developing university performance measures • Performance measures as the “tip of the evidence-based iceberg” • Going below the surface • Applying an organizational learning lens • Some implications and related thoughts

  11. Lessons Learned • Early lessons on measurement theory • 1994 NDIR Volume • Measuring Institutional Performance Outcomes (APQC-MIPO) • Developing campus PIs to link planning, budgeting, evaluation and improvement • Taking it to the next level

  12. Measurement Theory: The Inductive–Deductive Cycle

  13. Measurement Theory • Validity • Warranted assertion (Dewey) • Degree to which the measure accurately represents the concept (what you are attempting to measure) • Size of a person (weight, height, circumference, body mass) • Quality of instruction (course ratings, peer review, student learning)? • Reliability • Degree to which the measure yields consistent results • Course ratings taken mid-term/end-term • Unless very careful attention is paid to one’s theoretical assumptions and conceptual apparatus, no array of statistical techniques will suffice – Blalock, 1982
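
The reliability bullet above (course ratings taken at mid-term and again at end of term) can be made concrete with a test-retest check: correlate the two administrations of the same instrument on the same sections. A minimal sketch in Python, using made-up ratings purely for illustration:

```python
from statistics import mean, stdev

def pearson(x, y):
    """Pearson correlation between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical course ratings (1-5 scale) for the same six sections,
# collected once at mid-term and again at end of term.
mid_term = [4.2, 3.8, 4.5, 3.1, 4.0, 3.6]
end_term = [4.0, 3.9, 4.4, 3.3, 4.1, 3.5]

# Test-retest reliability: how consistently the instrument orders
# the same sections across the two administrations.
print(f"test-retest reliability r = {pearson(mid_term, end_term):.2f}")
```

A correlation near 1 suggests the instrument ranks sections consistently; validity remains the separate question of whether the ratings capture instructional quality at all.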

  14. 1994 NDIR Volume • Using Performance Indicators to Guide Strategic Decision Making (Borden and Banta, Eds.)

  15. Lessons • Borden and Bottrill: Where you stand on PIs depends on where you sit • Ewell and Jones: Think before you count • Jongbloed and Westerheijden (Europe): PIs out, Quality Assurance in • Dooris and Teeter (TQM): PIs are fine, if P stands for Process • Dolence and Norris: KPIs are the fuel of a strategic decision engine • DeHayes and Lovrinic (ABC): Show me the money…and what you do with it.

  16. Lessons (continued) • Banta and Borden Criteria for Effective PIs • Start with purpose • Align throughout organization • Align across input, process, output • Coordinate a variety of methods • Use in decision making

  17. Measuring Institutional Performance Outcomes • An American Productivity and Quality Center (APQC) benchmarking study

  18. APQC MIPO Findings • The best institutional performance measures communicate the institution’s core values • Good institutional performance measures are carefully chosen, reviewed frequently, and point to action to be taken on results • External requirements and pressures can be extremely useful as starting points for developing institutional performance measurement systems • Performance measures are best used as “problem detectors” to identify areas for management attention and further exploration • Clear linkages between performance measures and resource allocation are critical, but the best linkages are indirect (and non-punitive)

  19. MIPO Cont. • Performance measures must be publicly available, visible, and consistent across the organization • Performance measures are best considered in the context of a wider transformation of organizational culture • Organizational cultures supportive of performance measures take time to develop, require considerable “socialization” of the organization’s members, and are enhanced by stable leadership • Performance measures change the role of managers and the ways in which they manage

  20. MIPO – Boiling it Down • You cannot ‘lead’ with performance measures • Performance measures emerge from a broader culture of evidence, that is, they are part of something bigger

  21. E.g.: PIs @ IUPUI – www.iport.iupui.edu

  22. Taking it to the Next Level: Accountability at Indiana University – Articulating and Attaining Strategic Goals and Objectives

  23. Audiences • Board of Trustees • Most comprehensive, University-wide view • Campus accreditors and (prospective) partners • Campus-specific objectives and indicators • Targeted packaging for… • Media; legislators; alumni; current and prospective students and their parents; research agencies and collaborators

  24. Purposes • Position IU strategically • Improve the effectiveness and quality of programs and services • Provide a common framework to align efforts across campuses • Communicate a clear and consistent message about IU’s broad goals • Enhance IU’s image • Define and document IU’s contributions to the state, students, and communities • Demonstrate integrity in accounting for the use of public and private resources

  25. Principles • Mission-centered • Research-driven • Transparency • Inclusive dimensions of excellence and quality • Empowerment and responsibility • Influenced by “best practices” • National Commission on Accountability in Higher Education

  26. Framework • University-wide strategic goals and core performance indicators • Campus performance objectives and indicators derived from mission, aligned to university goals and core indicators • Explicit link to administrative area goals and objectives • Annual performance reports and reviews • University and campuses

  27. Limitations of Measures/Metrics • Inherently imperfect • Overly simplistic • Not everything that counts can be counted, and not everything that can be counted counts – Albert Einstein

  28. Accommodating the Limitations • An imprecise answer to the right question is much better than a precise answer to the wrong question (paraphrasing John Tukey) • Triangulation • Using multiple, convergent measures to better reflect the underlying concept • Performance measures as the tip of the evidence-based iceberg
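
One simple way to operationalize the triangulation bullet above is to standardize each measure and average the z-scores into a composite, so that no single imperfect metric drives the conclusion. A minimal sketch, assuming hypothetical indicator names and values chosen only for illustration:

```python
from statistics import mean, stdev

def z_scores(values):
    """Standardize raw values to mean 0, standard deviation 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Hypothetical convergent measures of instructional quality for six programs.
# Indicator names and values are illustrative only.
measures = {
    "course_ratings": [4.2, 3.8, 4.5, 3.1, 4.0, 3.6],
    "peer_review":    [3.9, 3.7, 4.6, 3.0, 4.2, 3.4],
    "learning_gains": [0.40, 0.35, 0.55, 0.20, 0.45, 0.30],
}

# Triangulated composite: average the standardized scores so each measure
# contributes equally, regardless of its original scale.
standardized = [z_scores(vals) for vals in measures.values()]
composite = [round(mean(scores), 2) for scores in zip(*standardized)]
print(composite)
```

Programs where all three measures agree stand out clearly in the composite; large disagreements among the standardized scores are themselves a signal worth investigating before acting on any single number.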

  29. Performance Measures as the Tip of the Evidence-Based Practice Iceberg [Diagram: performance measures form the visible tip above a broader base of evidence-based practice, organized around a Plan–Implement–Assess–Improve cycle with vertical (hierarchical) alignment and horizontal (cross-unit) alignment]

  30. Evidence-Based Practice • Commonly used in clinical domain • Validity derived from rigorous research conducted by others and believed to generalize to other settings • For university endeavor there are limits to generalizability across settings • Focus shifts to more continuous use of process-generated data using less rigorous methods to monitor, reflect, and adjust

  31. Methods of Evidence-Based Practice • The many faces of evidence-based practice • Student learning outcomes assessment • Program evaluation • Program review • Quality improvement • Balanced score card • Benchmarking • The role of collaborative inquiry

  32. The Evaluation Cycle (adapted from Norman Jackson) • 1. Think about issues • 2. Engage with the problem • 3. Develop resources/strategies to improve • 4. Implement interventions (experiment) • 5. Evaluate impact (did it work as I intended? how did people respond? what were the results?) • 6. Plan to improve • Then: back to the drawing board, or on to something else

  33. The Assessment Matrix

  34. The Support Unit Matrix

  35. Quality Improvement Models • Advantages • Focus on process provides best chances for identifying points of improvement • Collaborative teams empower staff and help improve communication across units • Formulaic method and external staff support help guide teams and keep them on track • Sample methods • Penn State’s Fast Track • U of Wisconsin Accelerated Improvement

  36. PSU Fast Track

  37. UWisc Accelerated Improvement http://www.wisc.edu/improve/improvement/accel.html

  38. Program Review • Program self-study, site visit by “peers” • Common method for academic programs • Increasing use for administrative programs • Fits well with accreditation framework • Guidelines shape tone and tenor • Content standards • Review team composition • Flexibility accommodates range of inquiry orientations

  39. Limits of Program Review • Expensive and time-consuming • Can be done with little participation • Or with a lot • Results not always directly useful for change • Memorandum of understanding helpful • Episodic nature not responsive to changing environment

  40. Balanced Score Card (BSC) • Kaplan & Norton’s model, originally proposed for business • Financial performance • Customer service and satisfaction • Process effectiveness and efficiency • Organizational learning

  41. BSC in Higher Education • Ruben (1999) • Teaching/Learning • Programs/courses, student outcomes • Service/Outreach • University, profession, alumni, state, prospective students, families, employers • Scholarship/Research • Productivity/impact • Workplace satisfaction • Faculty/staff • Financial • Revenues/expenditures
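
Ruben’s adaptation can be sketched as a simple data structure, one entry per perspective, which is often how a scorecard is first prototyped before any dashboard tooling is chosen. A minimal sketch; the indicator names are illustrative assumptions, not a prescribed set:

```python
# A hypothetical sketch of Ruben's (1999) five scorecard perspectives
# for a university; indicator names are examples only.
scorecard = {
    "Teaching/Learning":      ["program and course quality", "student learning outcomes"],
    "Service/Outreach":       ["service to university, profession, alumni, state,"
                               " prospective students, families, and employers"],
    "Scholarship/Research":   ["research productivity", "research impact"],
    "Workplace satisfaction": ["faculty satisfaction", "staff satisfaction"],
    "Financial":              ["revenues", "expenditures"],
}

def report(card):
    """Print each perspective with its current indicators."""
    for perspective, indicators in card.items():
        print(f"{perspective}:")
        for indicator in indicators:
            print(f"  - {indicator}")

report(scorecard)
```

Keeping the perspectives and indicators in one explicit structure makes the balance visible: a scorecard dominated by financial indicators, for example, is easy to spot and correct before it is published.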
