
British Columbia Ministry of Forests Post-Implementation Review Framework June 2004



  1. British Columbia Ministry of Forests Post-Implementation Review Framework June 2004 Engagement: 220636851 Version 1.0

  2. Table of Contents

  3. Table of Contents • Introduction • PIR objectives • PIR model • PIR process

  4. Introduction

  5. Introduction • In light of the ever-increasing size and complexity of technology projects, organizations need to constantly review how projects are delivered and whether each project delivered the benefits that were anticipated. • Organizations also need to continually build on the lessons learned through the course of a project, both to ensure that the same mistakes are not repeated over and over again and to capitalize on best practices identified in a project and propagate them to other projects within the organization. • Gartner advises that public sector organizations implementing projects of the scope and magnitude of the ministry's should perform a project assessment to ensure that project deliverables are aligned not only with the stated objectives of the project but also with those of the government as a whole and the constituents it serves. • Experience demonstrates that the value of an independent review, which can be conducted either by a government audit organization or authority or by an external trusted party, is to highlight new priorities and suggest which ongoing initiatives should be reconsidered, accelerated or initiated to fit those priorities. Performing this as a totally independent review is very important, given its implications and possible influences.

  6. Guiding principles Some key guiding principles were followed through the development of this framework: • Alignment with British Columbia Government standards • The framework was built taking into consideration the following documents: • 1. MoF Administrative Closure • 2. XGOV PIR • 3. BC IM/IT CoE Policy • Alignment with industry best practices • Several standards were used in the definition of the framework: • Project Management Institute (PMI): As the main standard-setting body in project management, PMI has defined the best practices for project management used in this framework. • Gartner independent review and verification: Inspired by IEEE Standard 1059-1993, Guide for Software Verification and Validation Plans, it describes best practices with respect to application development. • Simple and easy to understand • Since PIRs are often seen as “non-value added” by project teams, the framework aims to be simple to understand (requiring a minimal investment of time) and easy to administer. • Actionable • The major benefit of a PIR is not in understanding what went right or wrong, but in ensuring that the lessons learned can be applied to other projects. • The framework focuses heavily on identifying those lessons learned and defining prevention / remediation plans to apply them to other projects.

  7. PIR objectives

  8. Objectives of a PIR • The purpose of a post-implementation review (PIR) is to evaluate an investment after it has completed development (e.g., after its transition from the implementation phase to the operation and maintenance phase) in order to validate actual investment results. This review is conducted to (1) examine differences between estimated and actual investment costs and benefits, along with possible ramifications for unplanned funding needs in the future, and (2) extract “lessons learned” about the investment selection and control processes that can be used as the basis for management improvements. Similarly, PIRs should be conducted for investment projects that were terminated before completion, to identify potential management and process improvements. • The timing of a PIR can be problematic: a PIR conducted too soon after an investment has been implemented may fail to capture the full benefits of the new system, while institutional knowledge about an investment can be lost if the PIR is conducted too late. As a general guideline, PIRs should be conducted within 6 to 18 months after the investment begins its operational phase (a simple illustration of this window follows). However, this guideline should be adjusted depending on the nature of the investment project and the expectations for the timing of benefit realization documented in the project plans.
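A rough sketch of the 6-to-18-month guideline above, assuming only that the operational start date is known; this helper is illustrative and not part of the framework:

```python
from datetime import date

def pir_window(operational_start: date) -> tuple[date, date]:
    """Recommended PIR window: 6 to 18 months after operations begin."""
    def add_months(d: date, months: int) -> date:
        # Whole-month arithmetic; day clamped to 28 so it is valid in any month.
        y, m = divmod(d.year * 12 + d.month - 1 + months, 12)
        return date(y, m + 1, min(d.day, 28))
    return add_months(operational_start, 6), add_months(operational_start, 18)

# Example: a system that went operational in June 2004.
earliest, latest = pir_window(date(2004, 6, 1))
print(earliest, latest)  # 2004-12-01 2005-12-01
```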

  9. PIR Model

  10. Seven evaluation domains The framework is based on seven evaluation domains. These domains aim to provide the greatest coverage of the project's critical success factors while keeping the framework simple. By evaluating each of these seven domains, BC MoF will be able to gain a much better understanding of what went right and what went wrong in its projects. This framework is only to be used as a tool to guide the organization in the definition of best practices; BC MoF will have to determine how the results of this framework can best serve that purpose. Evaluation domains: 1. Project benefits 2. Project management 3. Requirements Definition 4. Project Organization 5. Quality Assurance 6. Risk Management 7. Project communication

  11. Evaluation domain: 1. Project benefits Objectives: Validate that the benefits were properly identified and realized throughout the course of the project. Elements evaluated: • Benefits identification • Benefits realization schedule • Ownership of planned benefits • Benefits realization Rationale: The benefits of a project are the only reason the project is executed in the first place. If a project fails to meet its objectives (or to define them), then there was no reason to undertake it.

  12. Evaluation domain: 2. Project management Objectives: Validate that project management best practices were followed throughout all the phases of the project. Elements evaluated: • Initiation phase: Appropriate charter • Planning phase: Project plan, understanding of the project objectives by every member of the project team • Execution phase: Attention to quality, time and costs • Controlling phase: Control of the key project metrics, attention to changing circumstances, validation of hypotheses • Closing phase: Validation that objectives were met, sign-off on all deliverables Rationale: Proper project management is essential for the execution of a project. Too often, projects are planned to a great level of detail, only for those plans never to be followed throughout the project.

  13. Evaluation domain: 3. Requirements Definition Objectives: Validate the process used to gather requirements and the validity of the requirements identified throughout the project. Validate the mechanism used to identify trade-offs between the requirements and project constraints. Elements evaluated: • Process used to gather requirements • Relationship between requirements definition and the Software Development Life Cycle • Visibility of requirements for gap analysis, release management, design review, testing and training Rationale: Requirements definition is at the foundation of the majority of IT projects. Without proper mechanisms and processes to gather requirements, the project might not be working towards the achievement of its business objectives, because it is working on too many, too few, or the wrong requirements.

  14. Evaluation domain: 4. Project Organization Objectives: Validate that the project was composed of the right mix of vendors, leadership, business and technical resources. Elements evaluated: • Leadership from the primary business unit(s) and support organization(s) • Mid-management and staff involvement • Alignment with the business • Skills and experience of the project management staff • Dealing with contractual issues • Alignment of the vendors' own interests with those of the project • Roles and responsibilities of the entire implementation team Rationale: Project composition tends to be very homogeneous (all IT people, for example), promoting groupthink and a lack of understanding of the other realities surrounding the project and its stakeholders.

  15. Evaluation domain: 5. Quality Assurance Objectives: Validate that the project followed best practices with respect to its quality assurance program. Elements evaluated: • Linkage to requirements • Linkage to identified risks • Sufficient resources, tools (e.g., test robots) and time for execution • Allowance for regression testing • Allowance for performance testing • Feedback to developers Rationale: Oftentimes, for the sake of “getting it done”, quality is compromised, leading to product defects. A sound, rational quality assurance program ensures that a consistent product is delivered.

  16. Evaluation domain: 6. Risk Management Objectives: Validate that the project risks were identified and managed throughout the course of the project. Elements evaluated: • Ongoing risk management plan: identification / assessment of project risks and risk mitigation steps • Visibility of project risks to the entire project team (internal and vendors) • Ability to execute risk mitigation plans, e.g., additional resources, changes in schedule, communication with end users • Striking a balance between project risks and time and budget • Appropriate and sufficient documentation • Linkage of risks with product support strategies (e.g., developing support strategies at the outset, not after the fact) Rationale: Project teams are typically good at identifying risks but less so at managing them. Oftentimes, the risk management process is neglected during project execution, the most critical phase of the project.

  17. Evaluation domain: 7. Project communication Objectives: Validate that the project team identified the project stakeholders, developed a communication plan and executed on the plan. Elements evaluated: • Identification of the project stakeholders • Identification of their communication requirements • Involvement of project stakeholders through various mechanisms Rationale: Project teams tend to isolate themselves from the project stakeholders, seldom communicating risks and issues, which leads to surprises for many of the people involved when an issue degenerates. Proper communication can resolve some of these issues before they become problematic.

  18. Assessment scale • The following slide describes the assessment scale to be used in the assessment of the different evaluation domains. • Each of the domains is rated based on the information gathered by the PIR team. The assessment is not meant to be a rigorously scientific exercise. Instead, it is meant to give the reviewers the flexibility to look at each evaluation domain as a whole and determine whether its level was appropriate for the size and realities of the project.

  19. Assessment scale • 5 Best-in-class: Represents an example for others to follow. • 4 Success: Best practices were followed systematically. • 3 Passable: Best practices were followed most of the time. • 2 Ad hoc: Mostly reactive approach to the project. • 1 Failure: The project didn't meet expectations.
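To make the scale concrete, here is a minimal sketch, assuming a simple Python representation of the five-point scale and the seven evaluation domains; the names are illustrative, and the framework itself prescribes no particular tooling:

```python
from enum import IntEnum

class Rating(IntEnum):
    """The framework's five-point assessment scale."""
    FAILURE = 1        # The project didn't meet expectations
    AD_HOC = 2         # Mostly reactive approach to the project
    PASSABLE = 3       # Best practices followed most of the time
    SUCCESS = 4        # Best practices followed systematically
    BEST_IN_CLASS = 5  # An example for others to follow

# The seven evaluation domains defined by the framework.
DOMAINS = [
    "Project benefits", "Project management", "Requirements definition",
    "Project organization", "Quality assurance", "Risk management",
    "Project communication",
]

# A completed PIR assessment is simply one rating per domain.
assessment: dict[str, Rating] = {d: Rating.PASSABLE for d in DOMAINS}
```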

  20. Key questions • Where does technology fit in? • Projects rarely fail because of the technologies they use. Instead, they fail because of a mismatch between the technology and the need (selecting the solution before knowing the problem) or because of risks inherent in the selected technology (unproven technology, lack of skills and experience, a bleeding-edge application). These are addressed within the risk management, requirements definition and project management domains.

  21. PIR Assessment

  22. Assessment criteria • The following pages describe some of the evaluation criteria that are suggested for each of the evaluation domains. • The evaluation criteria represent only a portion of all the topics that could be evaluated under each domain. We believe that by using this subset, BC MoF will be able to obtain a very good understanding of the project status. • The evaluation domains are rated based on the evaluation criteria, but not in a direct relationship: the input from the evaluation criteria is used to assess the overall rating of the evaluation domain. This prevents the positive assessment of an evaluation domain where all the best practices were consistently followed except for a key one that caused the failure of the project (see the sketch below).
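One way to read the rating rule above is that criteria inform the domain rating, but a missed key criterion caps the rating regardless of how the other criteria scored. A hypothetical sketch, assuming a simple criterion structure; the framework prescribes no formula, so the mapping here is purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    question: str
    met: bool
    key: bool = False  # a criterion whose failure caps the domain rating

def rate_domain(criteria: list[Criterion]) -> int:
    """Rate a domain 1-5: proportion of criteria met, capped by key misses."""
    met = sum(c.met for c in criteria)
    score = 1 + round(4 * met / len(criteria))  # map proportion onto 1..5
    if any(c.key and not c.met for c in criteria):
        score = min(score, 2)  # an "Ad hoc" ceiling despite other successes
    return score

# Example: every best practice followed except a key one.
benefits = [
    Criterion("Are the benefits identified?", met=True),
    Criterion("Is there a benefit realization schedule?", met=True),
    Criterion("Were the benefits met?", met=False, key=True),
]
print(rate_domain(benefits))  # 2, even though two of three criteria are met
```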

  23. Assessment criteria 1. Project benefits • Are the benefits identified? • Are the benefits quantifiable? • Were the benefits met? • Are the benefits assigned to an owner (the sponsor, for example)? • Are the benefits realistic? • Is there a benefit realization schedule? • Is the benefit realization schedule realistic?

  24. Assessment criteria 2. Project management • Project initiation • Is there a project charter? • Is there a project sponsor? • Is the sponsor accountable for the project objectives? • Is the project a priority for the parties involved? • Project planning • Is there a project plan? • Is the project plan complete? • Did the entire team participate in the creation of the project plan? • Did the project sponsor sign off on the project plan? • Did the client sign off on the project plan? (not always relevant) • Did the project team sign off on the project plan? • Are project dependencies and interdependencies identified? • Does the project plan take administrative processes (procurement, for example) into account? • Project execution • Are the key metrics in place and reported regularly? • Are the required resources assigned to the project? • Did the team follow the project plan? • Was the maintenance team included in the development?

  25. Assessment criteria 2. Project management • Project control • Is there a change management mechanism? • Are the deliverables validated by the “client”? • Is the project delivering in frequent, small phases? • Is the project steering committee engaged in project decisions? • Is the project team engaging the different stakeholder groups regularly? • Was the project plan updated to reflect changing conditions? • Were there frequent project status reports to the project sponsor? • Project close-out • Were the administrative project closure activities completed? • Were all deliverables signed off by the “client”? • Did the project meet its objectives?

  26. Assessment criteria 3. Requirements definition • Is there a process to gather requirements? • Was the process followed? • Is there a sign-off process for the users and sponsor? • Is there a mechanism for users to change their requirements (change management)? • Is there a mechanism for trade-offs between requirements and project constraints?

  27. Assessment criteria 4. Project organization • Is there an assigned project manager? • Is there a project manager from the “client” organization? • Is there an assigned project team? • Are there sufficient resources for the project? • Are the resources qualified for the project? • Was the turnover rate for the project high? • Is there a project steering committee? • Is the project steering committee composed of representatives from the different stakeholder groups? • Is the project steering committee engaged in the decision-making process?

  28. Assessment criteria 5. Quality assurance • Is there a quality assurance plan? • Are there quality metrics (targets and actuals)? • Was the quality assurance process followed?

  29. Assessment criteria 6. Risk management • Were the risks for the project identified? • Was a mitigation plan built for the identified risks? • Was there a mechanism to monitor risks throughout the project? • Were contingency plans identified for major risks? • Was a budget assigned to the contingency plans?

  30. Assessment criteria 7. Project communication • Were the stakeholders identified for the project? • Is there a communication plan for each stakeholder / stakeholder group? • Was the communication plan followed? • Did the communication plan account for the communication style of each stakeholder group?

  31. PIR Process

  32. Who does a PIR? • Project management office • Typically responsible for most PIRs; would conduct a PIR as part of the normal project close-out process. • Appropriate for non-eventful projects, where the project is typically considered a success, in order to capture lessons learned. • External third party • Typically engaged for projects considered challenging or for projects that have failed. • Third parties have no stake in the project and can offer an impartial view, which gives more credibility to the PIR results. • Audit team • The organization's audit team could perform the PIR as part of its internal audit duties. • Does require some project management and technical competencies. • Senior management • Unusual; senior management would typically rely on an external third party. • Senior management would typically be the ones asking for a PIR on a project that has failed.

  33. PIR phases There are four phases in a post-implementation review (PIR): data collection, analysis, validation and action plan. Each phase builds on the previous one, and the following pages describe each phase in more detail (a structural sketch follows as well). One of the main issues with PIRs is the perceived lack of impartiality: the project team often feels the PIR team is trying to pin the blame on someone. It is therefore very important for the project team to have a chance to review and validate the results of the assessment before they are presented to anyone outside of the project. In the event of a disagreement between the PIR team and the project team, the concerns of the project team and the reasons for dismissing them should be documented in the PIR report.
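As a minimal sketch of the sequence, assuming the phase names and deliverables listed on the following pages; this is illustrative only and not part of the framework:

```python
# The four PIR phases in order, each with its expected deliverables.
PIR_PHASES: dict[str, list[str]] = {
    "Data collection": ["Documentation list", "Interview results"],
    "Analysis": ["Preliminary assessment", "Major themes"],
    "Validation": ["Validated assessment"],
    "Action plan": ["Action plan with ownership", "PIR report"],
}

def next_phase(current: str) -> str | None:
    """Return the phase that follows `current`, or None after the last one."""
    order = list(PIR_PHASES)
    i = order.index(current)
    return order[i + 1] if i + 1 < len(order) else None

assert next_phase("Analysis") == "Validation"
assert next_phase("Action plan") is None
```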

  34. Phase 1: Data collection Objectives: Identify the data sources and collect the data necessary to perform the PIR. Process: • Surveys: In some circumstances, surveys can be used to gather end users' impressions of the project. • Interviews: Review the evaluation domains with the project sponsor, project manager, project team and key stakeholders of the project. • Documentation review: Review the project charter, project plan, project status reports, risk reports and project control book. Deliverables: • Documentation list • Interview results

  35. Phase 2: Analysis Objectives: Analyse the data gathered in order to generate a preliminary assessment. Process: • Compare interview results to best practices and point out discrepancies. • Score the evaluation domains from one to five according to the assessment scale. • Document the major themes and hypotheses. Deliverables: • Preliminary assessment • Major themes

  36. Phase 3: Validation Objectives: Validate the preliminary assessment with the project team in order to 1) ensure that the assessment is accurate and 2) get buy-in from the project team. Process: • Validate the preliminary assessment with the project manager. • Validate the preliminary assessment with the project team. • Update the assessment in light of the validation process. Deliverables: • Validated assessment

  37. Phase 4: Action plan Objectives: Develop an action plan to address the issues and best practices identified in the assessment. Process: • Identify, with the project team, the root causes of the issues identified. • Identify best practices to be repeated in the future. • Develop a prevention / remediation plan for the issues and best practices identified. • Assign ownership of the prevention / remediation plan. • Document the findings and action plan in a PIR report. Deliverables: • Action plan with ownership • PIR report

  38. PIR report table of contents 1. Executive summary • Overall assessment • Lessons learned • Follow-up actions 2. Background • Project background and objectives • Scope, terms of reference, direction and project team 3. Project assessment and measurement • 1. Project benefits • 2. Project management • 3. Requirements definition • 4. Project organization • 5. Quality assurance • 6. Risk management • 7. Project communication 4. Observations and analysis • Best practices identified • Major issues identified • Recommendations for improvement 5. Recommendations and conclusion • Prevention / remediation plan • Ownership of the action items • Follow-up schedule

  39. Example of a PIR assessment In this high-level assessment, we can see that the project was well managed but failed to do proper quality assurance.

  40. Gartner Contact Information Simon Chapleau Gartner Consulting Telephone: +1-514-331-7488 Facsimile: +1-514-331-4427 E-mail: Simon.Chapleau@gartner.com
