This presentation shows how a dashboard report with drill-downs, peer comparisons, and significance tests adds value, helping senior administrators monitor institutional performance, track historical trends, and benchmark against peers. Presented at the SAIR conference in 2005.
A Dashboard Report: Value Added Through Drill-Downs, Peer Comparisons, and Significance Tests

Mary M. Sapp, Ph.D.
Assistant Vice President
Office of Planning & Institutional Research
University of Miami

SAIR, October 24, 2005
Definition of Dashboard

A dashboard is a visual display of Key Performance Indicators (KPIs) presented in a concise, intuitive format that allows decision makers to monitor institutional performance at a glance.
Dashboard Characteristics
• Provides a visual display of important Key Performance Indicators (KPIs)
• Uses a concise, intuitive, "at-a-glance" format (icons and colors)
• Offers a high-level summary (reduces voluminous data)

A display of "gauges" to monitor key areas
Uses of a Dashboard
• Provides a quick overview of institutional performance
• Monitors the institution's progress over time (trends)
• Alerts the user to problems (colors indicate positive/negative data)
• Highlights important trends and/or comparisons with peers
• Allows access to supporting analytics when needed to understand KPI results (drill-down)
Predecessors & Related Approaches
• Executive Information Systems, and their predecessor, Decision Support Systems
• On-Line Analytical Processing (OLAP) associated with data warehouses
• Balanced scorecards
• Key success factors
• Benchmarking
• Key performance indicators
Why Do a Dashboard?

Senior managers:
• Want to monitor institutional performance
• Are very busy, with little time to study reports
• Value reports that clearly show conclusions
• Appreciate an overview, with indicators from different areas in one place
• Use both trends and peer data

"What you measure is what you get." (Robert S. Kaplan and David P. Norton)
Impetus for Next Generation Dashboard Report
• Session at the Winter 2004 HEDS conference
• Representatives from 4 HEDS institutions shared dashboard reports
• The presentation & discussion prompted ideas about features that might add value
• The dashboard report described here was developed as a result

An example of how a conference session led to a project that would not have been done otherwise
Characteristics of Dashboards Presented at HEDS Conference
• All used a single page (though some had a second page for definitions & instructions)
• All presented trend data (changes over 1, 5, 6, and 10 years)
• All used up/down arrows, "<"/">" icons, or "Up"/"Down" to show the direction of trends
• Three displayed minimum and maximum values for the trend period
• Three used colors to show whether trends were positive or negative
• One used peer data
Questions Generated by HEDS Dashboards and Proposed Solutions

Concise or detailed?
• HEDS: Laments about not being able to provide more detail ("senior administrators should want to see more")
• Reaction: Sympathized with that viewpoint, but have learned most senior administrators want summaries, not detail
• Next Generation Dashboard: Keep the concise format, plus links to optional graphs & tables
Issues that Came Up in Discussion at HEDS, and UM Solutions

Trends or peer data?
• HEDS: All four dashboards used trend data; one also used peer data
• Reaction: UM values peer data to support benchmarking
• Next Generation Dashboard: Use both peer and trend data
Issues that Came Up in Discussion at HEDS, and UM Solutions

When should an icon for a trend or a difference from peers appear?
• HEDS: Dashboards seemed to display icons for all non-zero differences
• Reaction: Didn't want small differences to be treated as real changes
• Next Generation Dashboard: Use p-values from regression and t-tests to control the display of icons for trends and peer comparisons
Issues that Came Up in Discussion at HEDS, and UM Solutions

Include minima and maxima?
• HEDS: Three dashboards displayed minima and maxima over the trend period
• Reaction: UM's senior VP decided this was too cluttered
• Next Generation Dashboard: Shows trends for its own institution and the 25th and 75th percentiles of peers, with no maxima or minima
Unique Aspects of Next Generation Dashboard
• Provides drill-down links to graphs and tables for more detail, if desired
• Provides peer data in addition to trends
• Uses regression (rather than maxima and minima) to determine the direction of trends
• Uses the statistical significance of the slope (rather than just the difference) to generate trend icons
• Uses t-tests to generate the peer comparison summary

Functions like adding a Global Positioning System (GPS) to your dashboard
Implementation
• Two dashboard reports: student indicators and faculty/financial indicators
• 17 and 21 KPIs, respectively, each on a single page
• A box for each KPI with its current value, an arrow to show the trend, and text to show its relation to peers
• Links to more detailed graphs and tables
Indicator Display

Upper left corner:
• Up arrow, down arrow, or horizontal line
• Shows the direction of UM's trend over the last 6 years
• Up vs. down vs. flat, based on the slope of a regression and its p-value (see the sketch below)
• Color based on the desired outcome
• Links to a graph with trends for UM and the 25th & 75th percentiles for peers

Center of box: current value
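A minimal sketch of the trend-icon rule, written in Python as a stand-in for the Excel macro logic. It regresses a KPI on year and reports an up or down arrow only when the slope is statistically significant; the six-year window follows the slides, while the 0.05 cutoff, the function name, and the sample data are illustrative assumptions.

```python
from scipy.stats import linregress

def trend_icon(years, values, alpha=0.05):
    """Return 'up', 'down', or 'flat' for a KPI's six-year trend."""
    result = linregress(years, values)
    if result.pvalue >= alpha:
        return "flat"  # slope not distinguishable from zero
    return "up" if result.slope > 0 else "down"

# Hypothetical six-year retention-rate series
years = [2000, 2001, 2002, 2003, 2004, 2005]
values = [84.1, 84.9, 85.2, 86.0, 86.4, 87.1]
print(trend_icon(years, values))  # -> 'up'
```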
Indicator Display

Upper right corner:
• Shows relation to 12 peers
• "Above Peers" vs. "Below Peers" vs. "Mid. of Peers", based on t-tests (UM vs. the mean of peers; see the sketch below)
• Color based on the desired outcome
• Links to a table with five years of data for UM and peers
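A minimal sketch of the peer-comparison rule under the same assumptions: a one-sample t-test compares UM's value to the distribution of the 12 peer values, and the label is "Mid. of Peers" unless the difference is significant. The peer figures and the 0.05 cutoff are illustrative, not from the original report.

```python
from scipy.stats import ttest_1samp

def peer_label(own_value, peer_values, alpha=0.05):
    """Return the peer-comparison text for one KPI."""
    stat, pvalue = ttest_1samp(peer_values, popmean=own_value)
    if pvalue >= alpha:
        return "Mid. of Peers"  # not significantly different
    # stat < 0 means the peer mean falls below our own value
    return "Above Peers" if stat < 0 else "Below Peers"

# Hypothetical retention rates for 12 peers
peers = [81.0, 79.5, 83.2, 80.8, 82.1, 78.9,
         80.0, 81.7, 79.2, 82.5, 80.3, 81.1]
print(peer_label(84.5, peers))  # -> 'Above Peers'
```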
Macros

Used to display:
• Data for the chosen year
• Direction of the arrow icons
• Color of the arrows: green for positive, red for negative, black for neutral (see the sketch below)
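A minimal sketch of the color rule the macro applies, again as a Python stand-in: green when the movement matches the desired outcome, red when it runs against it, black when there is no significant trend. Whether "up" is desirable is indicator-specific (up is good for retention, for example, but bad for an acceptance rate at a selective school); the function name and mapping are illustrative.

```python
def arrow_color(direction, higher_is_better):
    """Map a trend direction to the dashboard's arrow color."""
    if direction == "flat":
        return "black"  # neutral: no real change
    moved_as_desired = (direction == "up") == higher_is_better
    return "green" if moved_as_desired else "red"

print(arrow_color("up", higher_is_better=True))   # -> 'green'
print(arrow_color("up", higher_is_better=False))  # -> 'red'
```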
Spreadsheet
• Dashboard developed in an Excel spreadsheet, with one sheet for the dashboard report and one sheet for each indicator (graph, peer data, and raw data)
• A macro updates the year and controls the display of the arrow icons (direction and color)
• A spreadsheet with a template for the dashboard and instructions for customizing it will be shared upon request (leave a card)
Indicators Used
• Selected with input from the Provost, Vice President for Enrollments, Senior Vice President for Business & Finance, and Treasurer
• Mandatory criterion: availability of peer data. Sources:
  • CDS data from U.S. News (Peterson's/Fiske for earlier years)
  • IPEDS
  • National Association of College and University Business Officers
  • Council for Aid to Education
  • National Science Foundation
  • Moody's (averages for A-rated institutions used instead of individual peers)
  • National Academies
• See the last page of the handout for the list of indicators used by UM and the HEDS institutions
Dashboard Complements Existing Key Success Factors (KSF) Report
• Distinction between monitoring "critical" measures (tactical/operational, usually updated on a daily, weekly, or monthly basis) and tracking strategic outcomes (key to long-term goals, updated less often)
• Both the KSF report & the Dashboard are presented to senior administrators in the Operations Planning Meeting (the KSF report bi-monthly and each Dashboard annually)
KSF Monitors Changes for Critical Tactical KPIs
• KPIs in the KSF report usually relate to process (e.g., admissions, revenue sources, and expenditures in various categories)
• KSF indicators are limited to those that change on a continuous (e.g., daily, monthly) basis, as captured at the end of each month
Dashboard Monitors Strategic KPIs
• KPIs related to effectiveness and quality (student quality and success, faculty characteristics, peer evaluations)
• Dashboard KPIs are not included in the KSF report because they are measured on an annual rather than continuous basis
• Dashboard KPIs are limited to indicators for which peer data are available
Future Directions and Adaptations
• Adapt the Dashboard format for UM's KSF report
• Include targets and significant differences from targets, instead of or in addition to peers
• Make the Dashboard available online
• Link directly to various data sources (e.g., a data warehouse)
• Apply at the school or department level
• Allow individuals to personalize their own dashboards to include the KPIs directly relevant to them
Implementing the Next Generation Dashboard at Other Institutions
• The session's focus is on effective presentation rather than integration of data into the report (a low-tech spreadsheet, with tables of existing data copied in)
• The spreadsheet itself can be used, or some of its key concepts can be adapted to other situations
• The author will e-mail the spreadsheet template and instructions to those interested
Choosing KPIs
• Choosing which KPIs to use is critical
• Space is limited, so choose carefully
• The appropriateness of KPIs is institution-specific
• Critical or strategic focus?
• Interview key stakeholders to determine what data are important to them
• Use different types of KPIs (e.g., quality, process, financial, personnel) to provide a balanced perspective
Demo of Dashboard Spreadsheet

Copies of the spreadsheet are available upon request: e-mail pliu@miami.edu