SCRAM: Schedule Compliance Risk Assessment Methodology



  1. SCRAM: Schedule Compliance Risk Assessment Methodology CSSE ARR - March 2012 Los Angeles USA Adrian Pitman Director Materiel Acquisition Improvement Defence Materiel Organisation Australian Dept. of Defence

  2. What does SCRAM mean? • Shoo - Go away! • Secure Continuous Remote Alcohol Monitoring (as modelled by Lindsay Lohan) • Schedule Compliance Risk Assessment Methodology

  3. SCRAM Schedule Compliance Risk Assessment Methodology • Collaborative effort: • Australian Department of Defence - Defence Materiel Organisation • Systems and Software Quality Institute, Brisbane, Australia • Software Metrics Inc., Haymarket, VA

  4. DMO SCRAM Usage • SCRAM has been sponsored by the Australian Defence Materiel Organisation (DMO) • To improve our Project Schedule Performance in response to Government concern • DMO equips and sustains the Australian Defence Force (ADF) • Manages 230+ Concurrent Major Capital Equipment Projects & 100 Minor (<$20M) defence projects

  5. DMO SCRAM Usage (cont.) • SCRAM has evolved from our reviews of troubled programs: • Schedule is almost always the primary concern of program stakeholders • SCRAM is a technical schedule risk analysis methodology that can identify and quantify risk to schedule compliance (to focus risk mitigation) • SCRAM is also used to identify the root causes of schedule slippage, enabling remediation action

  6. What SCRAM is Not • Not a technical assessment of design feasibility • Not an assessment of process capability • However, process problems may be raised and treated as issues if process performance is found to be contributing to schedule slippage

  7. Topics • Overview of SCRAM • SCRAM Components • Process Reference and Assessment Model (ISO 15504 compliant) • Root Cause Analysis of Schedule Slippage (RCASS) Model • SCRAM Assessment Process • Experiences using SCRAM • Benefits of Using SCRAM

  8. SCRAM Components • SCRAM comprises the following: • Root Cause Analysis of Schedule Slippage (RCASS) Model • SCRAM Process Reference/Assessment Model • SCRAM Assessment Process • Review Preparation & Project Awareness • Risk and Issue Identification • Schedule Health Check • Monte Carlo Analysis • Reporting • Reference Model and Assessor Training, Assessors' Guidebook (currently under development)

  9. SCRAM PR/AM Model • Rework Planning • Rework Management • Technical Debt Management (to be added in the next update) • Developed by the SCRAM Team • ISO 15504 compliant (Levels 1 and 2) • Contains processes, practices, and implementation evidence • All PR/AM processes are traceable to a category within the RCASS Model • Model is in the public domain; download from: www.scramsite.org

  10. SCRAM-RCASS • [Diagram: the RCASS model's categories – Stakeholders, Subcontractors, Requirements, Functional Assets, Workload, Rework, Staffing & Effort, Management & Infrastructure, Schedule & Duration, Schedule Execution] • Adapted from the Integrated Analysis Model in McGarry et al., Practical Software Measurement: Objective Information for Decision Makers

  11. Root Cause Analysis of Schedule Slippage (RCASS) • Model evolved with experience on SCRAM assessments • Used as guidance for: • Questions during assessments • Categorizing the wealth of data and details gathered during an assessment • Highlighting missing information • Determining the root causes of slippage • Recommending a going-forward plan • Recommending measures to serve as leading indicators • For visibility and tracking in those areas where there are risks and problems • Similar to the use of the Structured Analysis Model in PSM to guide categorization of issues and risks via issue identification workshops

  12. SCRAM Assessment Process • 1.0 Assessment Preparation • 2.0 Project Awareness • 3.0 Project Risk / Issue Identification • 4.0 Project Schedule Validation • 5.0 Data Consolidation & Validation • 6.0 Schedule Compliance Risk Analysis (output: Schedule Compliance Risk Quantified) • 7.0 Observation & Reporting

  13. SCRAM Key Principles • Minimal Disruption • Artefact review (plans, procedures, model evidence) conducted offline • Information is collected one person at a time • Interviews typically last an hour • Independent • Review team members are organizationally independent of the program under review • Non-advocate • All significant issues and concerns are considered and reported regardless of origin or source (customer and/or contractor) • Some SCRAM reviews have used a joint contractor/customer team, which facilitates joint commitment to resolving review outcomes

  14. SCRAM Key Principles (cont.) • Non-attribution • Information obtained is not attributed to any individual • Focus is on identifying and mitigating the issues/risks • Corroboration of Evidence • Significant findings and observations are based on at least two independent sources of corroboration • Rapid turn-around • One to two weeks spent on-site • Executive out-briefing presented at the end of the second week • Written report (including team recommendations) delivered within the following two weeks

  15. SCRAM Team Composition • Assessment conducted by a small team including: • Engineer assessors to validate the technical baseline • Validate the WBS and engineering-related bases of estimate (BoEs), validate workload estimates, assess technical risk, identify technical debt • Scheduler (experienced in the project's scheduling tool) • Validates schedule construction • Performs the schedule health checks • Runs the Monte Carlo analysis • Other domain specialists as needed • E.g. a Flight Test Engineer to assess airworthiness regulatory certification and the adequacy of the Flight Test DT&E program

  16. Schedule Review and Health Checks • To evaluate schedule construction and logic • Includes analyses of task durations, dependencies, logic, constraints, schedule float (positive or negative), successors, etc. (see the sketch following this slide) • Government, Prime, and subcontractor schedule integration/alignment is reviewed • Ensure external dependencies are included and linked in the schedule logic • Interface availability, specialist resources, Government Furnished Equipment (GFE) delivery, integration & test facilities, test assets, etc. • Assess the critical path • Is there contingency in the schedule if risks are realized? • Any rework contingency? • Or is the schedule so tight that nothing can go wrong?
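
A minimal sketch of how such health checks might be automated, assuming a simple in-memory task model. The Task fields, the 44-day long-duration threshold, and the sample data are illustrative assumptions, not part of SCRAM; a real assessment works directly in the project's scheduling tool.

```python
# Sketch of automated schedule health checks over a simplified task model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    name: str
    duration_days: int
    predecessors: List[str] = field(default_factory=list)
    successors: List[str] = field(default_factory=list)
    total_float_days: int = 0
    hard_constraint: bool = False  # e.g. a fixed "must finish on" date

def health_check(tasks: List[Task]) -> List[str]:
    """Flag common schedule-construction problems for assessor review."""
    findings = []
    for t in tasks:
        if t.duration_days > 0 and not t.predecessors:
            findings.append(f"{t.name}: no predecessors (dangling start)")
        if t.duration_days > 0 and not t.successors:
            findings.append(f"{t.name}: no successors (open end)")
        if t.total_float_days < 0:
            findings.append(f"{t.name}: negative float ({t.total_float_days} days)")
        if t.hard_constraint:
            findings.append(f"{t.name}: hard date constraint overrides schedule logic")
        if t.duration_days > 44:  # illustrative threshold (~2 months)
            findings.append(f"{t.name}: long duration ({t.duration_days} days) can hide slippage")
    return findings

if __name__ == "__main__":
    sample = [Task("Integrate GFE radios", 60,
                   predecessors=["GFE delivery"], total_float_days=-5)]
    for finding in health_check(sample):
        print(finding)
```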

  17. Monte Carlo Analysis • Based on the SCRAM risk analysis, allocate three-point estimates (optimistic, most likely, and pessimistic task durations) to tasks/work packages on the critical and near-critical paths, using the risks identified via RCASS, and run the Monte Carlo simulation • An illustrative sketch follows below
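
A minimal sketch of the simulation step using Python's standard library: each critical-path task's duration is sampled from a triangular distribution built on its three-point estimate, and percentile completion times are reported. The task names and estimates are illustrative assumptions; an actual assessment runs the Monte Carlo in a schedule risk tool against the validated schedule.

```python
# Monte Carlo over three-point task-duration estimates on the critical path.
import random

# (optimistic, most likely, pessimistic) durations in working days -- illustrative
critical_path = [
    ("Detailed design",    (20, 30, 55)),
    ("Implementation",     (60, 80, 130)),
    ("Integration & test", (30, 45, 90)),
    ("Acceptance",         (10, 15, 30)),
]

def simulate(n_runs: int = 10_000) -> list:
    """Return sorted total critical-path durations across n_runs trials."""
    totals = []
    for _ in range(n_runs):
        # random.triangular takes (low, high, mode)
        total = sum(random.triangular(lo, hi, mode)
                    for _, (lo, mode, hi) in critical_path)
        totals.append(total)
    totals.sort()
    return totals

if __name__ == "__main__":
    totals = simulate()
    for pct in (50, 80, 90):
        idx = int(len(totals) * pct / 100)
        print(f"P{pct} completion: {totals[idx]:.0f} days")
```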

  18. Topics • Overview of SCRAM • SCRAM Components • RCASS Model • Assessment Process • Experiences using SCRAM • Benefits of Using SCRAM

  19. Some Statistics • Number of SCRAM assessments to date • 10 DMO project assessments conducted using SCRAM or SCRAM principles (i.e. before SCRAM was formalised) • Number of years • First review conducted in 2007 • Time on site • Typically two weeks – one week of interviews/data collection, one week of data consolidation, analysis and out-brief • Typical assessment team size: • SCRAM Lead plus 4 or 5 assessors • Team experience: typically 150-200 years of combined project experience

  20. What Underpins Good Project Schedule Performance • [Diagram: a schedule management cycle whose elements include Customers, Requirements, Workload Estimates, Staffing & Effort, Schedule and Duration, and Rework, linked by the activities Construct Schedule, Baseline & Execute Schedule, Execute & Manage, Assess & Mitigate Risk, Monitor & Control, Status and Report, Forecast & Replan, and Communicate to Stakeholders]

  21. What Causes Projects to Slip? • There are often multiple causes of schedule slippage: poor planning and schedule construction, and issues that arise during schedule execution • Once risks and root causes have been identified, they can be mitigated/remediated • Construction Stage Causes: • Ambiguous, misunderstood requirements • Inadequate planning • Poor schedule construction • Poor schedule estimation • Overly optimistic estimates • Underestimated technical challenges • IPDP not factored in • Inadequate SI&T planning • Unplanned dependencies • Execution Stage Causes: • ‘Actual’ productivity below estimate • Requirements volatility • Schedule not used as a communication or project management tool • Poor risk management/mitigation • Technical debt + interest • Rework (not scheduled, no contingency) • Inadequate resources (incl. staff and skills) • Stakeholder involvement issues • External unforeseen factors

  22. SCRAM-RCASS • [Diagram: the RCASS model's categories – Stakeholders, Subcontractors, Requirements, Functional Assets, Workload, Rework, Staffing & Effort, Management & Infrastructure, Schedule & Duration, Schedule Execution] • Adapted from the Integrated Analysis Model in McGarry et al., Practical Software Measurement: Objective Information for Decision Makers

  23. Stakeholders • “Our stakeholders are like a 100-headed hydra – everyone can say ‘no’ and no one can say ‘yes’.” • Experiences • A critical stakeholder (the customer) added one condition for acceptance that removed months from the development schedule • A failed organisational relationship: key stakeholders were not talking to each other (even though they were in the same facility)

  24. Requirements • “What was that thing you wanted?” • Experiences • Misinterpretation of a communication standard led to an additional 3,000 requirements to implement the standard • A large ERP project had two system specifications – one with the sponsor/customer and a different specification under contract with the developer – would this be a problem?

  25. Subcontractors • If the subcontractor doesn't perform, additional work is required by the Prime • Subcontractor management is essential • Experiences • A subcontractor omitting processes in order to make delivery deadlines led to integration problems with other system components • Prime and subcontractor schedules were not aligned

  26. Functional Assets • Commercial off-the-shelf (COTS) products that do not work as advertised result in additional work • Experiences • A COTS product required a “technology refresh” because the project was years late (this cost the project $8M) • IP issues with a COTS product resulted in additional development effort • Over 200 instantiations of open-source code resulted in a breach of the indemnity requirements

  27. Workload • Overly optimistic estimates • Inadequate BoEs • Source lines of code grossly underestimated • Contract data deliverables (CDRLs) workload often underestimated by both contractor and customer • Experiences • Identical estimates in four different areas of software development (cut & paste estimation) • A re-plan based on twice the historical productivity, with no basis for the improvement • Five delivery iterations before CDRL approval

  28. Staffing & Effort • High turnover, especially among experienced staff • Experiences • Generally, most projects had a shortage of human resources • The parent company sacked 119 associated workers • The remaining workers went on a “go slow” • Staff scheduled for 12-hour days (to recover schedule)

  29. Schedule & Duration • Area of primary interest • Experiences • No effective integrated master schedule to provide an overall understanding of the completion date of the project • 13 subordinate schedules • Critical Path went subterranean!

  30. Schedule Execution • Using the schedule to communicate project status and plans • Experiences • The schedule was not available to program staff or stakeholders • Undergoing a schedule tool transition for approx 2 years

  31. Rework • Often underestimated or not planned for (e.g. defect correction during integration and test) • Experiences • No contingency built into the schedule for rework • Clarification of a requirement resulted in an additional 7 software releases (not originally planned or scheduled)

  32. Management & Infrastructure • Contention for critical resources • Processes • Risk management, configuration management • Experiences • Lack of adequate SI&T facilities (in terms of fidelity or capacity) concertinaed the schedule, causing a major blowout

  33. Topics • Overview of SCRAM • SCRAM Components • RCASS Model • Assessment Process • Experiences using SCRAM • Benefits of Using SCRAM

  34. SCRAM Benefits • SCRAM root-cause analysis model (RCASS) useful in communicating the status of programs to all key stakeholders • Particularly senior executives (They get it!) • Identifies Root Causes of schedule slippage and permits remediation action • Identifies Risks and focuses Risk Mitigation action • Provides guidance for collection of measures • Provides visibility and tracking for those areas where there is residual risk • Provides greater confidence in the schedule

  35. SCRAM Benefits (cont.) • SCRAM can be used to validate a schedule before project execution (drawing on the systemic schedule management issues SCRAM has identified) • Widely applicable • SCRAM can be applied at any point in the program life cycle • SCRAM can be applied to any major systems engineering activity or phase • Examples • System Integration & Test (often – as this is typically when the sea of green turns to a sea of red) • Aircraft flight testing • Installation/integration of systems on ships • Logistics Enterprise Resource Planning (ERP) application roll-out readiness

  36. Schedule Risk Assessment (SRA) Process (chart courtesy of NAVAIR 4.2) • 1. Develop a complete critical path network and prepare for SRA • 2. Identify critical milestones for risk quantification • 3. Enter risk parameters (min., most likely, max.) • 4. Run the schedule simulation & quantify the impact of risk on the schedule • 5. Analyze schedule results (probability vs. time) • 6. Document results • 7. Present the position to the program office and develop risk reduction actions • Results are incorporated into risk management processes: the impact of project risk is assessed and critical risks are fed back for risk reduction • SCRAM can be repeated to measure effectiveness
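
Step 5 amounts to reading a compliance probability off the simulated completion-time distribution. A minimal sketch that builds on the simulate() function from the earlier Monte Carlo sketch; the 170-day target is a hypothetical contract duration, and the quoted output is illustrative only.

```python
# Step 5 sketch: probability of meeting a target duration, read directly
# off the sorted list of simulated totals returned by simulate().
def probability_of_meeting(totals: list, target_days: float) -> float:
    """Fraction of simulation runs that finish within target_days."""
    return sum(1 for t in totals if t <= target_days) / len(totals)

# Usage (hypothetical contract duration and output):
#   probability_of_meeting(simulate(), 170)  -> e.g. 0.62
```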

  37. SCRAM QUESTIONS • For further information on SCRAM, contact: www.scramsite.org • Govt/Govt to Govt – Adrian Pitman: adrian.pitman@defence.gov.au • In Australia – Angela Tuffley: a.tuffley@ssqi.org.au • In USA – Betsy Clark: betsy@software-metrics.com • In USA – Brad Clark: brad@software-metrics.com

  38. Acronyms • ANAO – Australian National Audit Office • BoE – Basis of Estimate • COTS/MOTS – Commercial off the Shelf/Modified or Military off the Shelf • DMO – Defence Materiel Organisation (Australia) • GFE – Government Furnished Equipment • ISO/IEC – International Organization for Standardization/International Electrotechnical Commission • ISO/IEC 15504 – Information Technology – Process Assessment Framework • RCASS – Root Cause Analysis of Schedule Slippage • SCRAM – Schedule Compliance Risk Assessment Methodology • SMI – Software Metrics Inc. (United States) • SSQi – Systems & Software Quality Institute (Australia)
