Follow-up Bootstrap Case Study
The Measurement Choice

  • The follow-up decision was defined as whether to proceed with the project as planned or make a significant reduction in scope by removing functions

  • The value of information analysis (VIA) of this decision indicated that the risk of cancellation was a key variable

  • Further calibrated estimates and decomposition were uninformative, and insufficient historical data exists for creating an “actuarial” model

  • Bootstrapping the chance of cancellation was judged to be the most feasible measurement method

  • Additional investments may use this bootstrap model


Bootstrapping Overview

  • Historical analysis of IT investments

  • First Workshop:

    • Review history

    • Identify Success Factors

    • Confirm possible ranges

  • Design test assessments

  • Second Workshop:

    • Calibrate for binary questions

    • Conduct collaborative assessment

  • Independent assessments

  • Compute regression model

  • Confirm model


Questions for Initial Planning

  • Is bootstrapping necessary? (explain the alternatives and when bootstrapping is a good fit)

  • Hold a kickoff: explain the objectives and approach with specific examples and success stories; studies show that bootstrapped models match or outperform the experts they are built from

  • What is the scope of the portfolio?

  • What outcome is to be bootstrapped?

  • What historical information is obtainable and where?

  • Who are the decision makers?

  • Who will be attending the workshops?

  • Schedule the workshops, interviews, and the presentation to validate the model


Project Planning Estimates

  • Historical data gathering: 1-2 people, 1-3 days

  • Preparation for 2 workshops: 1-2 people, 2-4 hours each

  • Conduct 2 workshops: 1-2 facilitators + participants, 1/2 day (3-4 hours) each

  • Construct initial bootstrap list: 1 person, 1-3 hours

  • Construct final bootstrap list: 1 person, 1-3 hours

  • Build regression model: 1-2 people, 4-8 hours

  • Prepare for presentation to confirm model: 1-2 people, 6-8 hours

  • Conduct presentation to confirm model: 1-2 presenters + participants, 1 hour


Historical Analysis

  • Determine scope of historical data needed

    • How far back do we need data? Aim for up to 30 examples

    • Do we need investment size, duration, status, objective, etc.? (have standard list)

  • Identify historical data available on IT investments

    • Budgeting process/accounting data

    • IT staff memory

    • Any metrics efforts

    • Past strategic IT plans

  • Collect investment data

  • Consolidate the data into a single table for the handout (a consolidation sketch follows this list)
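
A minimal sketch of the consolidation step, assuming pandas is available; the source tables, key, and column names below are illustrative stand-ins, not the case study's actual data sources.

```python
import pandas as pd

# Illustrative stand-ins for two of the sources named above; in practice
# these come from the budgeting system, metrics efforts, staff interviews, etc.
budget = pd.DataFrame({"investment_id": ["A", "B"],
                       "size_usd": [1_200_000, 450_000]})
metrics = pd.DataFrame({"investment_id": ["B", "C"],
                        "status": ["cancelled", "completed"],
                        "duration_months": [14, 9]})

# Merge on a shared key, keeping the union of investments and columns,
# so the handout shows every investment any source knows about.
history = budget.merge(metrics, on="investment_id", how="outer")

history.to_csv("historical_investments_handout.csv", index=False)
print(history)
```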


First Workshop Objectives

  • The first Bootstrap workshop is meant to be a free-form brainstorming forum to address the following:

    • Introduce concepts/objectives to new participants

    • Review the historical data and attempt to spot trends and success factors

    • Identify which investments were extreme examples of the variable being bootstrapped

    • List potential predictive variables

    • Determine realistic values of predictive variables including combinations of values

    • Define criteria for bootstrap output

    • Agree on input consolidation rules – e.g., average the group's estimates, throw out the highest/lowest, etc.


Results of First Workshop

  • We identified the scope of the portfolio as any randomly chosen investment of this organization

  • There were 4 participants

  • We identified the following variables as pertinent to a follow-up measurement on chance of cancellation:

    • Is the investment a documented strategic initiative?

    • 90% confidence interval for time remaining (months)

    • Is some part of the investment a compliance requirement?

    • The number of business units involved

    • Is the sponsor business, IT, or corporate?

    • % over-budget and % over-schedule

    • Test score of staff regarding project plan knowledge

    • Project manager and sponsor evaluation of project

    • % deliverables complete


Design Test Assessments

  • Using the identified predictive variables, generate a list of hypothetical investments

  • The range of individual values should reflect the actual portfolio – i.e., you should not have mostly investments over $50 million if that size is rare for this client

  • The combination of values in each hypothetical investment should be realistic – i.e., the size and duration should fit each other

  • Make sure the list represents investments across the range of possible bootstrapped output values

  • Produce a short table that lists each investment with hypothetical values and blanks for their input (perhaps 10 investments; a generation sketch follows this list)
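
A minimal sketch of generating the trial list. The variables echo those identified in the first workshop, but the distributions, ranges, and the size-to-duration rule below are illustrative assumptions; in practice they come from the workshop's confirmed ranges.

```python
import random

rng = random.Random(42)  # fixed seed so the generated list is reproducible

def make_hypothetical(rng: random.Random) -> dict:
    """One hypothetical investment with mutually consistent values."""
    size_musd = round(rng.lognormvariate(0.5, 1.0), 1)  # mostly small, few large
    # Tie duration loosely to size so the combination stays realistic.
    duration_months = max(2, int(6 + 4 * size_musd ** 0.5 + rng.gauss(0, 3)))
    return {
        "strategic_initiative": rng.random() < 0.3,
        "compliance_component": rng.random() < 0.2,
        "size_musd": size_musd,
        "duration_months": duration_months,
        "business_units": rng.randint(1, 6),
        "pct_over_budget": round(max(0.0, rng.gauss(10, 15)), 1),
        "chance_of_cancellation": None,  # blank, to be filled by the evaluator
    }

trial_list = [make_hypothetical(rng) for _ in range(10)]
```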


Second Workshop

  • Calibrate for binary questions

  • Present trial investment list (just 5 investments), explain values shown and inputs needed

  • Discuss each investment as a group

  • Identify changes to list

  • Obtain calibrated estimates for each

  • Explain next steps


Prepare Final Bootstrap

  • Modify constraints based on findings from second workshop

    • Clarify definitions/units of measure

    • Add/drop variables

    • Confirm input ranges

  • Generate new list of hypothetical investments

  • The list should be long enough to produce at least 100 responses in total, and no fewer than 30 + (number of variables) responses per evaluator (a sizing and randomization sketch follows this list)

  • Randomize list order

  • Options:

    • Make some investments duplicates (for measuring consistency)

    • Include a few best/worst case investments
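
A minimal sketch, under assumed counts, of the sizing rules, consistency duplicates, and shuffle described above. The stand-in generator only marks where the real hypothetical-investment generator from the design step would plug in.

```python
import random

rng = random.Random(7)

# Sizing rules from this slide: at least 100 responses in total, and at
# least 30 + (number of variables) responses per evaluator.
n_variables, n_evaluators = 12, 4
per_evaluator = max(30 + n_variables, -(-100 // n_evaluators))  # ceiling div

# Stand-in generator; in practice, reuse the design-step generator with
# the constraints revised after the second workshop.
def make_hypothetical(rng: random.Random) -> dict:
    return {"size_musd": round(rng.lognormvariate(0.5, 1.0), 1),
            "business_units": rng.randint(1, 6),
            "chance_of_cancellation": None}

base = [make_hypothetical(rng) for _ in range(per_evaluator - 2)]

# Duplicate a couple of investments to measure evaluator consistency.
probes = [dict(inv) for inv in rng.sample(base, 2)]

final_list = base + probes
rng.shuffle(final_list)  # randomize order so duplicates are not adjacent
assert len(final_list) >= per_evaluator
assert len(final_list) * n_evaluators >= 100
```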


Calibrated Estimation Results

  • Each evaluator assessed chance of cancellation for 48 investments

  • Variance between evaluators was often very large, but might have been smaller had we done the trial evaluation or calibration

  • Olympic scoring throws out the highest and lowest estimates (a sketch follows this list)

  • Disagreement among evaluators averaged 16% but was as much as 60%

  • Difference between Olympic scores of duplicates was 6%

  • Nobody stood out as particularly inconsistent or consistent but Ando and Vinay were clearly more optimistic than Jean-Rene and Cecile

  • Clearly, these chances of cancellation are high for any RAVI project
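
A minimal sketch of the Olympic consolidation rule; the four estimates in the example are illustrative, not the case study's data.

```python
def olympic_score(estimates: list[float]) -> float:
    """Average after discarding the single highest and lowest estimate."""
    if len(estimates) <= 2:
        raise ValueError("Olympic scoring needs at least 3 estimates")
    trimmed = sorted(estimates)[1:-1]
    return sum(trimmed) / len(trimmed)

# Four evaluators' chance-of-cancellation estimates for one investment:
print(olympic_score([0.05, 0.20, 0.25, 0.65]))  # -> 0.225
```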

[Chart: chance-of-cancellation estimates (0%-100%) for each assessed investment, by evaluator (VKU, JRR, CPP, AAN) and Olympic score]


Compute Regression

  • Aggregate inputs of various estimators

  • Convert inputs into quantities

    • Use pivot tables on unordered, discrete (but non-binary) variables

    • Graph continuous variables against output to look for obvious non-linear relationships

  • For each output variable (confidence of success, chance of cancellation, etc.), compute a regression model (a regression sketch follows this list)

  • Try combinations of higher order terms where you think there is a compounding effect

  • Size is always a good candidate for higher-order terms

  • Compare model error to evaluator inconsistency (the model error should be smaller)

  • Test changes in “controllable” success factors – this may identify sub-zones
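
A minimal sketch of the regression step on synthetic data, assuming the inputs have already been converted to quantities. scikit-learn is one reasonable tool (the case study does not name one), and the variables, sizes, and coefficients here are illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for the consolidated (Olympic) responses: n investments
# described by k numeric variables, chance of cancellation as the output.
n, k = 160, 8
X = rng.random((n, k))
y = np.clip(0.1 + 0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2  # built-in nonlinearity
            + rng.normal(0, 0.05, n), 0, 1)

# Add a squared "size"-like column as a candidate higher-order term.
X_aug = np.column_stack([X, X[:, 1] ** 2])

model = LinearRegression().fit(X_aug, y)
print("R^2:", model.score(X_aug, y))

# Model error, to compare against evaluator inconsistency (should be smaller).
residual_sd = np.std(y - model.predict(X_aug))
print("Residual s.d.:", residual_sd)
```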


Confirm Results

  • To confirm the results, show each of the following:

    • Plot of the original estimates vs. the model (a plotting sketch follows this list)

    • The test classification chart

    • Plot actual projects on classification chart and discuss discrepancies

  • Determine the volume of investments in each zone to check whether the support is realistic

  • Present results to group
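
A minimal sketch of the first confirmation plot, assuming matplotlib and placeholder arrays; in practice the points are the Olympic scores and model estimates for each assessed investment.

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative placeholder data; real values come from the bootstrap.
olympic = np.array([0.10, 0.25, 0.40, 0.55, 0.70, 0.85])
model_est = np.array([0.12, 0.22, 0.45, 0.50, 0.73, 0.80])

plt.scatter(olympic, model_est)
plt.plot([0, 1], [0, 1], linestyle="--")  # perfect-agreement line
plt.xlabel("Olympic score of calibrated estimates")
plt.ylabel("Model estimate")
plt.title("Comparison of estimates to model")
plt.show()
```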


Regression Results

  • Each investment was described by 12 variables, but the model reduced this to 8

  • After a few regression models were tried, one was found with an R-squared of 0.91

  • Higher-order variables were added, such as one that considered the level of over-budget only if the investment was neither strategic nor a compliance requirement (a sketch of such a term follows)

  • Part of the variance between the Olympic scores and the model was due to evaluator inconsistency, not actual error in the model
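
A minimal sketch of constructing that kind of higher-order variable, assuming pandas; the column names and values are illustrative.

```python
import pandas as pd

inv = pd.DataFrame({
    "pct_over_budget": [0.0, 25.0, 40.0, 40.0],
    "strategic_initiative": [True, False, True, False],
    "compliance_component": [False, False, False, True],
})

# Interaction term: over-budget counts only when the investment is neither
# strategic nor a compliance requirement (booleans coerce to 0/1).
inv["overbudget_if_discretionary"] = (
    inv["pct_over_budget"]
    * ~inv["strategic_initiative"]
    * ~inv["compliance_component"]
)
print(inv)
```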

[Chart: comparison of estimates to model – model estimate (0-1) plotted against the Olympic score of calibrated estimates (0-1)]

