
Performance Management Presentation



Presentation Transcript


  1. Performance Management Presentation
  Team Leader: Ronald Wilson
  Team Members: Gerald Hines, Fred Khoshbin, Cyrena Simons
  ORF, National Institutes of Health
  January 15, 2004

  2. Table of Contents
  Main Presentation
  • PMP Template
  • Customer Perspective
  • Internal Business Process Perspective
  • Learning and Growth Perspective
  • Financial Perspective
  • Conclusions and Recommendations
  • Appendix

  3. Relationship Among Performance Objectives

  4. Customer Perspective

  5. Customer Perspective (cont.)

  6. Customer Survey Results
  In the winter of 2002-2003, a customer survey of Institute Directors and Executive Officers was undertaken to gauge customer satisfaction with the services of the then Office of Facilities Planning.
  • Customers were asked to rate services provided by the Office of Facilities Planning according to a planning continuum: long-, medium-, and short-range.
  • Customer service targets for FY03-FY05, illustrated in the slides that follow, have been derived from scores attained on this customer survey.
  • Complete survey results, exclusive of full-text comments, can be found in the Appendix.

  7. Timeliness Summary: FY03 Customer Satisfaction Ratings of Facilities Planning Services [chart: rating scale from Unsatisfactory to Outstanding; N = 21]

  8. Timeliness: Long-Range Planning Service Ratings by FY [chart: mean ratings M = 7.14, M = 7.13, M = 7.57; scale from Unsatisfactory to Outstanding; N = 12 (FY02), N = 21 (FY03)]

  9. Timeliness: Mid-Range Planning Service Ratings by FY [chart: mean ratings M = 7.53, M = 7.47, M = 7.56; scale from Unsatisfactory to Outstanding; N = 12 (FY02), N = 21 (FY03)]

  10. Timeliness: Short-Range Planning Service Ratings by FY [chart: mean ratings M = 7.43, M = 6.91, M = 7.27; scale from Unsatisfactory to Outstanding; N = 21 (FY03), N = 12 (FY02)]

  11. Timeliness: FY03 Long-Range Planning Service Ratings by Position [chart: mean ratings by position; scale from Unsatisfactory to Outstanding; N = 6, N = 11. Note: Differences are not statistically significant.]

  12. Timeliness: FY03 Mid-Range Planning Service Ratings by Position [chart: mean ratings by position; scale from Unsatisfactory to Outstanding; N = 6, N = 11. Note: Differences are not statistically significant.]

  13. Timeliness: FY03 Short-Range Planning Service Ratings by Position [chart: mean ratings by position; scale from Unsatisfactory to Outstanding; N = 6, N = 11. Note: Differences are not statistically significant.]

  14. Summary
  • Three-fourths of respondents were from the Bethesda campus
  • The majority of respondents were EOs
  • Highest Long-Range Planning satisfaction ratings: responsiveness, availability, and competence
  • Lowest Long-Range Planning ratings: handling of problems and quality
  • Highest Mid-Range Planning satisfaction ratings: availability, competence, and responsiveness
  • Lowest Mid-Range Planning ratings: handling of problems and timeliness
  • Highest Short-Range Planning satisfaction ratings: competence, quality, responsiveness, and availability
  • Lowest Short-Range Planning ratings: handling of problems and timeliness

  15. Summary
  • Top three FY03 satisfaction ratings for all phases of planning: competence, responsiveness, and availability
  • Lowest FY03 satisfaction ratings for all phases of planning: handling of problems
  • FY03 Long-Range Planning ratings were lower than those for the other phases
  • Comparison of ratings between ICDs/SDs and EOs did not indicate large differences in perception
  • EO perceptions were more positive than those of ICDs/SDs
  • Differences are NOT statistically significant

  16. Summary
  Comments indicate:
  • Need for a clearer definition of role responsibilities in ORS/ORF
  • More coordination among DFP/DES/Leasing
  • Desire for a seamless process from planning to actual space acquisition
  • There are perceptions of inequity among ICs
  • ORS/ORF needs to develop a better understanding of billing procedures and rules within the ICs

  17. Customer Perspective: Planned Actions
  The Division of Facilities Planning is currently surveying its customers for FY04 as part of the annual Buildings and Space Planning process. It has modified last year’s customer survey to incorporate the appropriate discrete services for which it is responsible and revised the survey questions to better align with customer objectives.

  18. Internal Business Process Perspective

  19. Internal Business Process Perspective
  IB2: Number of meetings attended, and number of issues identified and strategies developed as a result of proactive contacts with customers, by source and type of meeting.

  20. Internal Business Process Perspective: Planned Actions
  In the coming year, the Division will:
  • Construct a planning milestone chart to track its processes
  • Develop a planning and programming tool to assist NIH in assessing project feasibility at the early stages of project development
  • Develop a tracking mechanism in the SJD log to track the progress of SJDs from the original request to the initial presentation of the request to the SRB

  21. Learning and Growth Perspective

  22. Learning and Growth Perspective

  23. Learning and Growth Perspective: Actions Planned and Underway
  Objective LG1: A training needs assessment has been conducted, and each staff member is developing a training plan.
  Objective LG2: A benchmarking survey is underway. To identify best practices, the Division has invited 9 federal installations within the Capital Region and/or DHHS, as well as 8 research universities and one hospital system, to participate in a benchmarking project. To date, 9 planning officers from this group have confirmed their willingness to participate. Survey responses are expected by the end of January 2004. Follow-up plans include interviews with some of the participants and another survey next year to (1) include other types of organizations and (2) expand the topics for benchmarking.
  Objective LG3, Part 1: Working with IT, the Division is developing a tracking tool for its customers to measure the frequency and effectiveness of its meetings and the success of its outreach efforts.
  Objective LG3, Part 2: Actions are the same as for Objective LG2.
  Objective LG4: As part of its monthly Information Exchange, the Division is dedicating time to identifying its technology needs. Four Information Exchanges were held last year.

  24. Financial Perspective

  25. Financial Perspective (cont.): Mandatory to Report on Unit Cost Objective

  26. Financial Perspective: Actual Costs for Discrete Services

  27. Financial Perspective: Planned Actions
  • The Division will reexamine its unit cost measures and establish new baseline unit costs from FY02 and FY03 actual cost data. FY04 costs will be monitored quarterly when the new CAN system is implemented.

  28. Conclusions

  29. Conclusions from PMP
  • The Balanced Scorecard approach has helped DFP better relate our day-to-day planning activities to long-term ORF and DFP planning goals.
  • The process has also highlighted how important satisfying customer requirements (some of which customers may be unaware of) is to the work of the Division. Getting ahead of customers’ needs, and thus guiding them through options before they are caught short and left with no alternatives, has gained more appreciation. The process has resulted in greater recognition of the value of working with our customers early in the facility and space planning processes, enabling us to spot problems well in advance and help customers plan through to solutions that meet their needs and those of NIH.
  • Working through the Learning and Growth perspective pushed our thinking outward to begin to focus on essential technical support and training needs. It is doubtful we would have pushed as far without the PMP process.
  • In summary, the process has helped sharpen our insights, deepened our understanding of what we are all about, and focused us on what we need to do to achieve the Division’s and ORF’s strategic goals.
  • The major initiatives for FY04 will be to revise the Customer Survey to track the Division’s progress in addressing prior-year customer concerns and to complete the benchmarking survey.

  30. Appendix: Measures
  Customer Perspective
  C1: Overall average rating for Customized ORS Customer Scorecard
  C2: Percent of Customized Scorecard survey respondents indicating satisfaction with the consistency and reliability of service received from the service group
  C3: Percent of Customized Scorecard survey respondents indicating satisfaction with access to the planning process regardless of the outcome of their individual request
  C4: Percent of Customized Scorecard survey respondents indicating the service group anticipated their needs and assisted them in incorporating respondents’ facilities issues into the planning process
  Internal Business Process Perspective
  IB1:
  A. Project Reviews: % of submittal deadlines met, such as:
  • Program of Requirements
  • Pre-Programming Documents
  • Design Drawings
  • Environmental Reviews
  B. DFP Planning Studies: % of planning milestones met
  • Master Plans
  • Strategic Facilities Plan
  • Site Feasibility Studies
  C. Short-term planning requests
  • SJDs: % of planning milestones met
  • Site Selection Requests: % of planning milestones met

  31. Appendix (cont’d.)
  Learning and Growth Perspective
  L1a: % of employees with current and pertinent training plans
  L1b: % of training plans that are fully executed
  L2: % of benchmarking plan completed
  L3: % of planning communication tools in place and "best practices" adopted
  L4: Number of information exchanges attended by 50% or more of staff
  Financial Perspective
  F1: DFP unit cost for master planning services per planning activity
  F2: DFP unit cost of B&S planning services per B&S Plan report
  F3: DFP unit cost to manage SJD process per square foot of space requested
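The financial measures F1-F3 are all unit costs: the total cost of a service divided by the units of that service delivered. As a minimal sketch of how such a measure might be computed (the dollar and square-footage figures below are hypothetical, not taken from the presentation):

```python
# Illustrative sketch of a unit-cost measure such as F3.
# All figures below are hypothetical examples, not presentation data.

def unit_cost(total_cost: float, units: float) -> float:
    """Unit cost = total cost of the service divided by units delivered."""
    return total_cost / units

# F3: DFP unit cost to manage the SJD process per square foot of space requested
sjd_process_cost = 150_000.0        # hypothetical annual cost of managing SJDs
square_feet_requested = 300_000.0   # hypothetical total sq. ft. across SJD requests

print(f"F3 unit cost: ${unit_cost(sjd_process_cost, square_feet_requested):.2f}/sq ft")
```

Rebaselining from FY02 and FY03 actuals, as the Financial Perspective planned actions describe, amounts to recomputing these ratios from the new cost data.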

  32. Appendix: Results from the FY03 ORS Customer Scorecard for NIH Facilities Planning
  Prepared by Amy Culbertson and Joe Wolski, Office of Quality Management, 16 April 2003

  33. Methodology
  • OQM was contacted by OFP early in FY03 to discuss customer assessment methodology
  • Desire to establish a system that is an integral component of annual Building and Space planning meetings with ICs
    • Cycle of meetings typically occurs early in the FY
    • Discussed the concern that a customer survey had just been completed in Sept FY02
  • Designed a new FY03 survey to address three components of the planning process
    • Long-range
    • Medium-range
    • Short-range
  • Modified the administration process based on lessons learned from the Sept FY02 process
    • Carefully tracked who responded to the survey
    • Sent follow-up emails to increase the response rate

  34. Methodology
  • Surveys administered via email in the December 2002 - January 2003 time frame
  • Sent to IC Directors (ICDs), Scientific Directors (SDs), and Executive Officers (EOs)
    • Comments received back from some EOs indicated displeasure at the survey going to ICDs/SDs
  • Reminder sent in late January
    • Received a substantial number of surveys after the reminder was sent
  • Gathered, tracked, entered, and analyzed data
    • Integrated responses from the Sept 02 survey as appropriate

  35. Survey Distribution
  FY03 Administration
  • Number of surveys distributed: 70
  • Number of respondents: 21
  • Response rate: 30%
  • Number of ICs receiving survey: 27
  • Number of ICs with at least one response: 14
  • IC response rate: 52%
  FY02 Administration
  • Number of surveys distributed: 85
  • Number of respondents: 12
  • Response rate: 18%
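The response rates above are simple ratios of respondents to surveys distributed (and, for the IC response rate, of responding ICs to ICs surveyed). A minimal sketch of the FY03 calculation, using the figures from this slide:

```python
# Illustrative calculation of the FY03 response rates reported above,
# using the figures from the Survey Distribution slide.

def response_rate(respondents: int, distributed: int) -> float:
    """Return the response rate as a percentage."""
    return respondents / distributed * 100

fy03_overall = response_rate(21, 70)  # surveys returned / surveys distributed
fy03_by_ic = response_rate(14, 27)    # ICs responding / ICs surveyed

print(f"FY03 response rate: {fy03_overall:.0f}%")    # 30%
print(f"FY03 IC response rate: {fy03_by_ic:.0f}%")   # 52%
```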

  36. FY03 Respondents by Location [chart; N = 20. Note: Multiple responses allowed; 1 respondent skipped this question.]

  37. FY03 Respondents by Primary Mission [chart; N = 20. Note: 1 respondent skipped this question.]

  38. FY03 Respondents by Position [chart; N = 21. Note: Feedback from respondents indicated that 6 surveys were filled out by the EO “on behalf of” the IC Director.]

  39. Summary: FY03 Customer Satisfaction Ratings of Facilities Planning Services [chart: rating scale from Unsatisfactory to Outstanding; N = 21]

  40. Long-Range Planning Service Ratings by FY [chart: mean ratings M = 7.14, M = 7.13, M = 7.57; scale from Unsatisfactory to Outstanding; N = 21 (FY03), N = 12 (FY02)]

  41. Mid-Range Planning Service Ratings by FY [chart: mean ratings M = 7.53, M = 7.47, M = 7.56; scale from Unsatisfactory to Outstanding; N = 21 (FY03), N = 12 (FY02)]

  42. Short-Range Planning Service Ratings by FY [chart: mean ratings M = 7.43, M = 6.91, M = 7.27; scale from Unsatisfactory to Outstanding; N = 21 (FY03), N = 12 (FY02)]

  43. FY03 Long-Range Planning Service Ratings by Position [chart: mean ratings by position; scale from Unsatisfactory to Outstanding; N = 6, N = 11. Note: Differences are not statistically significant.]

  44. FY03 Mid-Range Planning Service Ratings by Position [chart: mean ratings by position; scale from Unsatisfactory to Outstanding; N = 6, N = 11. Note: Differences are not statistically significant.]

  45. FY03 Short-Range Planning Service Ratings by Position [chart: mean ratings by position; scale from Unsatisfactory to Outstanding; N = 6, N = 11. Note: Differences are not statistically significant.]

  46. Do the facilities planning services in ORS support your Institute’s mission planning efforts? [chart: responses; N = 21 (FY03), N = 12 (FY02)]

  47. Do you understand how to get your Institute’s needs into the strategic facilities planning process? [chart: responses; N = 21 (FY03), N = 12 (FY02)]

  48. Does the current process work effectively for your Institute to acquire the space you need? [chart: responses; N = 21 (FY03), N = 12 (FY02)]

  49. Themes from Comments on Suggestions to Improve the Facilities Planning Process
  • Need clear definition of role responsibilities in ORS and seamless service from planning to acquiring the space
    • Among planners, DES, and real-estate leasing
    • Lack of communication between DES, Real Estate Leasing, and OBSF
    • Planning and SJD work well, but leasing needs more people
    • The space decision-making process does not consider the implications of leasing widely dispersed off-campus sites that NIH corporate infrastructure must support
  • There are perceptions of inequity among ICs for space
    • Method of providing space to “newer” ICs needs to be improved
    • Central Service facilities are given a low priority; needs are not met
  • ORS needs to have a better understanding of billing procedures and rules
    • Get clarification from OBSF regarding procedures/rules for TIA
    • ICs do not have full knowledge of the associated costs of leased space
    • Inequities in the way ICs are billed for space by ORS are not being corrected by ORS
    • An IC was billed even though the space was unusable
  • ICs would like more information on the B&F budget
  • ICs would like more independence in managing and renovating their leased space
  Note: Includes themes from both FY02 and FY03 comments.
