
Presentation Transcript


  1. Learning from Experience (LFE) - MCA, SWIFT & HAZOP
     27 ISMOR
     Chris Tilney, Atkins Ltd
     2 September 2010

  2. Structure of Presentation
     • Multi Criteria Analysis (MCA)
       - Lessons learned from decision support applications in defence
     • Structured What If Technique (SWIFT) & Hazard & Operability (HAZOP)
       - Brief introduction to the techniques
       - Lessons learned from recent nuclear safety applications
       - Potential non safety related applications in defence

  3. Multi Criteria Analysis (MCA)

  4. MCA – LFE (1)
     Context for lessons learned
     • Primarily focused on application of MCA to UK MoD procurement competitions, i.e. down-select decisions on Pre-Qualification Questionnaire (PQQ) responses and Tenders
     • Personal opinions of presenter, not necessarily shared by other Atkins staff or MoD personnel!

  5. MCA – LFE (2)
     • Structure of Assessment Hierarchies
     • Derivation of Weighting Factors
     • Preparing the Evaluation Team
     • Scoring Criteria
     • Aggregation
     • Audit Trail/Toolset
     • Presentation of Results

  6. MCA – LFE (3)
     Structure of Assessment Hierarchies (see the sketch below)
     • Senior Decision Makers/Scrutiny favour:
       - Towers not weighted relative to each other
       - No single, overall Figure Of Merit (FOM) score
       - Small set of criteria (15 – 20) works best
     • Tower 1 (Merits of solution offered): Performance; System Requirements Document (SRD) Compliance
     • Tower 2 (Capability to deliver solution): People; Processes; Tools; Plans; Delivery Schedule
     • Tower 3 (Commercial Aspects): Price; Terms & Conditions (T&Cs)
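To make the tower structure above concrete, here is a minimal Python sketch of the hierarchy as a plain data structure; the grouping of criteria under each tower follows the slide, but the representation itself is purely illustrative.

    # Illustrative representation of the three-tower hierarchy described
    # above. Towers are deliberately kept separate: they are not weighted
    # against each other and no single, overall FOM is computed across them.
    assessment_hierarchy = {
        "Tower 1: Merits of solution offered": [
            "Performance",
            "System Requirements Document (SRD) Compliance",
        ],
        "Tower 2: Capability to deliver solution": [
            "People", "Processes", "Tools", "Plans", "Delivery Schedule",
        ],
        "Tower 3: Commercial Aspects": [
            "Price", "Terms & Conditions (T&Cs)",
        ],
    }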

  7. MCA – LFE (4)
     Derivation of Weighting Factors
     • Pairwise comparison works best. Three 'rounds' to derive weights (see the sketch after this slide):
       - 1st round: initial pairwise scores followed by explanation of outliers
       - 2nd round: score again, then calculate weights based on consensus scores
       - 3rd round: review weights, adjust if necessary (with rationale)
     • Group dynamics, watch out for/control:
       - Dominant personalities
       - Deferral to rank (if military Subject Matter Experts (SMEs))
       - Non-assertive gurus
     Preparing the Evaluation Team
     • Evaluation Guide: helps enforce coherency, transparency, robustness of scoring process
     • Kick-off group brief: ditto
     • 1-2-1s on toolset: reduce risk of failing to keep to evaluation schedule
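As a minimal sketch of the pairwise weight derivation referred to above, the Python fragment below turns a consensus pairwise comparison matrix into normalised weights using a geometric-mean (row-averaging) calculation; the three criteria and the matrix values are hypothetical, and a real evaluation may use a different method.

    import numpy as np

    def pairwise_weights(matrix):
        # Geometric-mean (row-averaging) method: each criterion's weight is
        # the geometric mean of its row, normalised so the weights sum to 1.
        m = np.asarray(matrix, dtype=float)
        geo_means = m.prod(axis=1) ** (1.0 / m.shape[1])
        return geo_means / geo_means.sum()

    # Hypothetical consensus scores for three criteria; entry [i][j] is the
    # agreed relative importance of criterion i over criterion j.
    consensus = [
        [1.0,  2.0, 4.0],   # Criterion A
        [0.5,  1.0, 2.0],   # Criterion B
        [0.25, 0.5, 1.0],   # Criterion C
    ]

    print(pairwise_weights(consensus))  # approx. [0.571, 0.286, 0.143]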

  8. MCA – LFE (5)
     Scoring Criteria
     • Scoring scale:
       - Absolute 0 to 10 point scale
       - 3 reference points: 10, 7 & 4, with guidance for assessors
       - No 'extra' marks for exceeding the requirement
       - Confidence-adjusted scores (demonstrable/evidence-based capability)
     • Minimum of 2 Assessors: Lead & Shadow(s)
     • Moderators: resolve disagreements, holistic view/input to assessors
     Aggregation
     • Automate at lower levels of hierarchy (a sketch follows this slide)
     • "Man-in-loop" at higher levels of hierarchy
     • Trigger points/capability thresholds for all criteria that are scored
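A minimal sketch of the automated lower-level roll-up mentioned above, assuming a multiplicative confidence adjustment and a weighted average; the criteria names, weights, scores and the adjustment rule are illustrative assumptions, not the scheme used on any particular evaluation.

    def confidence_adjusted(raw_score, confidence):
        # Scale a raw 0-10 assessor score by a 0-1 confidence factor that
        # reflects how well the claimed capability is evidenced.
        return raw_score * confidence

    def roll_up(criteria):
        # Weighted average of confidence-adjusted scores at one level of the
        # hierarchy; 'criteria' maps name -> (weight, raw_score, confidence).
        total_weight = sum(weight for weight, _, _ in criteria.values())
        weighted = sum(weight * confidence_adjusted(score, conf)
                       for weight, score, conf in criteria.values())
        return weighted / total_weight

    # Hypothetical lower-level criteria under a single Tower 1 heading.
    performance = {
        "Range":       (0.5, 7, 1.0),  # fully evidenced
        "Payload":     (0.3, 9, 0.8),  # partially evidenced
        "Reliability": (0.2, 4, 0.6),  # claimed, little evidence offered
    }

    print(round(roll_up(performance), 2))  # 6.14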

  9. MCA – LFE (6)
     Audit Trail/Toolset - AWARD
     • Simultaneous, distributed input (over Restricted LAN Interconnect (RLI))
     • Strong functionality for:
       - Capturing assessor rationales, issues & concerns
       - Monitoring progress of evaluation
       - Reports/interrogation
     Presentation of Results
     • KISS principle to avoid information overload for Assessment Panel Members
     • Executive Summaries of what is being offered
     • Bar graphs for Towers 1 & 2
     • 'Horse blanket' for Tower 3 T&Cs
     • Risk-adjusted Whole Life Cost (WLC) plots for price
     • Combined Operational Effectiveness & Investment Appraisal (COEIA) plots for Value for Money (VfM)
     • Slides on major concerns, issues and 'red flags'

  10. MCA – LFE (7)
     Commercial Issues
     • Central finding of recent court judgements (e.g. Lettings International Ltd v London Borough of Newham): any aspect of the Authority's requirements, or of the way it will evaluate tenders, that could affect the way a bidder prepares its bid must be fully disclosed before tenders are returned.
     • Implications may include, for example:
       - Assessment hierarchies fully weighted, with automatic roll-up of weighted scores
       - Disclosure of assessor scoring guidance to tenderers
       - Predefined, prescriptive COEIA methodology (i.e. cannot be tailored after tenders are received); a Dstl study is investigating this issue
     • Are we entering dangerous waters? i.e. OA/Decision Support Analysis no longer informing decision makers but effectively making the decision (via option scores and option ranking) through slavish adherence to a detailed, predefined process

  11. MCA – LFE (8)
     Overall LFE findings
     • Well understood/accepted by Senior Decision Makers
     • Works well with careful preparation and checks & balances on aggregation
     BUT
     • Can be very resource intensive
     • Commercial issues may prevent/limit future application of some of the LFE and undermine the central tenet of OA: to inform decision makers

  12. Structured What If Technique (SWIFT) & Hazard & Operability (HAZOP) – Intro to techniques

  13. SWIFT – Intro to Technique (1)
     • Structured brainstorming method for analysing a process or system
     • Usually applied to systems or processes not deemed to be safety critical but which do have safety-related failure modes
     Approach
     • Select system, subsystem or process
     • Chair poses pre-planned 'What if' questions, identified via:
       - Task analysis
       - Basis of Design
       - Generic checklists
       - Process description
       - Standards, regulations & guidelines
       - Past incidents & accidents
     • Multi-disciplinary team (design, operation & maintenance) of SMEs answers the 'What if' questions
     • Results captured in log sheet by scribe

  14. SWIFT – Intro to Technique (2)
     Example SWIFT Log sheet (figure showing Steps 1 & 2 of completing the sheet; see the sketch below)
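Since the log-sheet figure is not reproduced in this transcript, the following sketch shows one way a single SWIFT log row could be represented in code; the field names and the example entry are hypothetical, not taken from the original sheet.

    from dataclasses import dataclass, field

    @dataclass
    class SwiftLogEntry:
        # One row of a SWIFT log sheet (illustrative field names only).
        what_if_question: str
        causes: str
        consequences: str
        existing_safeguards: str
        actions: list = field(default_factory=list)

    # Hypothetical entry, recorded by the scribe during the workshop.
    entry = SwiftLogEntry(
        what_if_question="What if the coolant supply is interrupted?",
        causes="Pump failure; loss of electrical supply",
        consequences="Temperature rise in the process vessel",
        existing_safeguards="High-temperature alarm and automatic trip",
        actions=["Confirm trip setpoint against the design intent"],
    )
    print(entry.what_if_question)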

  15. HAZOP – Intro to Technique (1)
     • Considers the operation (operability) of a system or subsystem
     • Systematically identifies deviations from the design intent which could lead to Hazards or operability problems
     • Multi-disciplinary SME team activity
     • Makes recommendations which could influence the design
     Approach (see the sketch after this slide)
     • Use Guide Word / attribute combination to suggest possible deviation(s)
     • Discuss and agree all credible causes
     • Identify all consequences, identifying Hazards
     • List existing safeguards
     • Discuss and agree further actions
     • Record
     • Repeat until complete
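As a rough illustration of the guide word / attribute step in the approach above, the sketch below enumerates deviation prompts from the standard HAZOP guide words and a few assumed process parameters; the parameter list and the loop are illustrative only.

    import itertools

    # Standard HAZOP guide words combined with a few assumed process
    # parameters; each combination is a prompt for the team to consider a
    # possible deviation from the design intent.
    GUIDE_WORDS = ["No", "More", "Less", "As well as", "Part of", "Reverse", "Other than"]
    PARAMETERS = ["Flow", "Pressure", "Temperature", "Level"]  # illustrative

    for guide_word, parameter in itertools.product(GUIDE_WORDS, PARAMETERS):
        deviation = f"{guide_word} {parameter}"  # e.g. "No Flow", "More Pressure"
        # For each deviation the team would agree credible causes, identify
        # consequences and hazards, list safeguards, agree actions and record.
        print(deviation)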

  16. HAZOP – Intro to Technique (2) Example HAZOP Log sheet

  17. SWIFT & HAZOP - LFE

  18. SWIFT & HAZOP – LFE (1)
     Preparation
     • Key to an effective workshop
     • Don't circulate 'what if' Qs in advance, to help avoid preformed views
     • Ensure reference material is to hand during the workshop, e.g. large-scale design drawings
     Attendees
     • Number: no more than 8, including Chair & Scribe
     • Expertise: full coverage required (tension with number!)
     • Chair:
       - Strong/assertive
       - Independent of the project
       - Ideally knows attendees well enough to draw them out, but not too well (bias/favouritism)
       - Has sufficient technical knowledge to understand the discussion and apply judgement when directing the study, e.g. distinguishing between superficial/trivial issues and important issues
     • Scribe: understands the domain and technical jargon

  19. SWIFT & HAZOP – LFE (2)
     SWIFT vs HAZOP
     • SWIFT
       - Doesn't require a finished design, i.e. can be applied to concepts
       - Quick and easy to implement (cf. HAZOP)
     • HAZOP
       - Mature design needed
       - Exhaustive analysis; completeness of analysis
       - Time consuming/resource intensive

  20. SWIFT & HAZOP – Potential Non Safety Applications

  21. SWIFT & HAZOP – Potential Non Safety Applications
     • Explore/define CONEMP
     • Develop OA scenarios & vignettes
     • Explore and assess merits of a business change programme
     • General risk identification & analysis
