
Evaluation Procedures

Learn about evaluation procedures for contract awards, including blind evaluations, streamlined scoring, individual evaluations, and detailed debriefings. Find out how to improve fairness and optimize profit in the evaluation process.


Presentation Transcript


  1. Evaluation Procedures – Brian Lines, Ph.D., Assistant Professor, University of Kansas, Dept. of Civil, Environmental & Architectural Engineering, brianlines@ku.edu, 785-864-6503

  2. Expertise-Driven Project Delivery (XPD) – [Process diagram showing the XPD phases: Scope Development, Performance Metrics, Procurement & Selection, Pre-Award Clarification, and Contract Award]

  3. Evaluation Procedures
     • Blind Evaluations – Keep it Fair
     • Streamlined Scoring – Keep it Simple
     • Individual Evaluations – Avoid Groupthink
     • Offer Detailed Debriefings
     • Plan Before You Sign
     Take-Away Tools:
     • 1-5-10 Evaluation Scale
     • Evaluation Scoresheets
     • Evaluation Matrix
     • Debriefing Template

  4. Blind Evaluations: Keep it Fair

  5. Evaluation Concerns
     • How are you deciding who wins?
     • Do you have a favorite? Is it fair?
     • Is it confusing?
     • Will you let me optimize my profit?

  6. Blind Evaluation – Make it Fair: The evaluated proposal documents MUST NOT contain any names that could identify the Proposer (company names, personnel names, project names, or product names).

  7. Improving Fairness – Evaluated Proposals
     • A non-blind evaluation process allows for bias (whether intentional or unintentional)
     • A fair evaluation process is blind/anonymous – evaluators do not know firm names
     • Example coding: "Smith Contractors Inc." appears to evaluators only as "Firm 103"
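
     A minimal sketch of how the coding step could be scripted, assuming procurement keeps the key file; the firm names and code range below are hypothetical:

         import random

         def assign_blind_codes(firms, start=101):
             """Map each firm to a random anonymous code such as 'Firm 103'."""
             codes = list(range(start, start + len(firms)))
             random.shuffle(codes)  # shuffled so code order reveals nothing
             return {firm: f"Firm {code}" for firm, code in zip(firms, codes)}

         # The key stays with procurement; evaluators only ever see the codes.
         key = assign_blind_codes(["Smith Contractors Inc.", "Acme Builders", "Delta Group"])
         print(key)  # e.g. {'Smith Contractors Inc.': 'Firm 103', ...}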

  8. Scope Definition vs. Proposal Variation – [Chart: variation in evaluation scores by level of scope definition (Poor / Moderate / High); N = 347 projects, 1,850 individual proposals]

  9. Evaluation Weights

  10. Case Study (2017 High Tech Facility)
     • $1+ billion project; 6 proposals
     • Proposal contents included: Cost; BIM Experience; Capacity; Innovation; Prefab; CX, QA, QC Processes; Management Approach; GMP Management; Cost Control Approach; Similar Experience; Contract Exceptions; Goals; KPIs; Insurance; Safety Plan
     • Average size of safety plan: 356 pages
     • Time spent evaluating safety plans: 2.5 weeks
     • Safety plan score standard deviation: 1.52% – two and a half weeks of reading barely moved the scores
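
     That low standard deviation is the signal: a category whose scores barely vary across proposals cannot differentiate proposers and may not be worth its evaluation cost. A minimal sketch of flagging such categories, with hypothetical scores and a hypothetical 5% threshold:

         from statistics import mean, pstdev

         def flag_non_differentiators(scores_by_category, threshold_pct=5.0):
             """Return categories whose score spread is below threshold_pct of the mean."""
             flagged = []
             for category, scores in scores_by_category.items():
                 spread = 100 * pstdev(scores) / mean(scores)
                 if spread < threshold_pct:
                     flagged.append((category, round(spread, 2)))
             return flagged

         # Hypothetical scores for six proposals:
         scores = {
             "Safety Plan":         [92, 93, 91, 92, 94, 92],  # everyone looks the same
             "Management Approach": [55, 80, 95, 40, 70, 85],  # real differentiation
         }
         print(flag_non_differentiators(scores))  # [('Safety Plan', 1.02)]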

  11. Demand Side Management Software RFP

  12. CPMS Software Implementation RFP

  13. System Integrator RFP

  14. Organizational Change Mgmt RFP

  15. Custodial Services RFP

  16. Waste Management RFP

  17. Hydrogenerator Overhaul RFP

  18. Environmental Remediation RFP

  19. Streamlined Scoring: Keep it Simple

  20. QUESTION – Evaluation Scales: What types do you normally use?

  21. Sample Evaluation Team Training Presentation – RFP for Financial Software System (typically 30-45 minutes once all details are included)

  22. Agenda
     • Big picture
     • What am I evaluating first? (blind proposals)
     • What comes next? (interviews + software demos)
     • Evaluation procedures and logistics
     • Evaluation schedule

  23. Big Picture
     • Goal of evaluations = select the best company, project team, and software system
     • Role of evaluators = make a selection, not the final decision
        • Vendors must differentiate themselves from their competitors
        • Focus on the info / level of detail needed to make a SELECTION
     • There will be additional clarifications after the selection
        • The selected vendor will be brought into Pre-Award Clarification
        • Evaluators can ask any additional questions
        • SMEs can be brought in to confirm any details
        • Additional software verification / demos can be requested

  24. What am I evaluating first? Step 1 = Anonymous Proposals
     Evaluation team scores (weights):
     • Execution Methodology (2 pg) – 10%
     • Risk Assessment: Controllable (2 pg) + Non-Controllable (2 pg) – 20%
     • Value Assessment (2 pg) – 10%
     • = 40% total
     Procurement scores (weights):
     • Cost Proposal – 15%
     • Past Performance Surveys for Project Teams – 2.5%
     • Past Performance Surveys for Software – 2.5%
     • = 20% total
     The evaluation team is also provided with the Functional Response & Technical Response.

  25. What comes next? Step 2 = Interviews & Software Verifications
     Evaluation team scores (weights):
     • Interviews with each Project Team – 20%
     • Software Verification with each Vendor's system – 20%
     • = 40% total
     Shortlisting procedures:
     • If necessary, a shortlist will be determined
     • Shortlisted firms will be notified and invited to the interviews + verifications
     • The evaluation team is provided with the Functional Response, Technical Response, Vendor Project Team, List of Subs (if any), and all References (surveys)
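
     Taken together, the Step 1 and Step 2 weights cover the full award (40% + 20% + 40%). A minimal configuration sketch, using the category names from these two slides, that records the structure and verifies it totals 100% before any scoring begins:

         WEIGHTS = {
             # Step 1 – anonymous proposals (evaluation team)
             "Execution Methodology":       10.0,
             "Risk Assessment":             20.0,
             "Value Assessment":            10.0,
             # Step 1 – procurement
             "Cost Proposal":               15.0,
             "Past Performance (Teams)":     2.5,
             "Past Performance (Software)":  2.5,
             # Step 2 – interviews & software verification
             "Interviews":                  20.0,
             "Software Verification":       20.0,
         }

         assert sum(WEIGHTS.values()) == 100.0, "category weights must total 100%"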

  26. Evaluation Summary

  27. 1-5-10 Rating Scale
     • 10 = ABOVE AVERAGE / high performance range (clear justification, differentiation above the average, demonstrates high performance capability, definitely want to interview/demo)
     • 5 = AVERAGE / average level of expertise (difficult to differentiate, no significant difference, acceptable to meet requirements, could interview/demo)
     • 1 = BELOW AVERAGE / low performance range (clear differential below the average, does not demonstrate high performance capability, don't need to interview/demo)
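
     One common reading of this scale is that evaluators record only the three anchor values rather than grading along the whole 1-10 continuum; it rewards clear differential instead of noisy fine-grained distinctions. A minimal validation sketch under that assumption:

         RATINGS = {
             10: "Above average – clear differentiation, definitely interview/demo",
             5:  "Average – no significant difference, could interview/demo",
             1:  "Below average – clear differential below average, skip interview/demo",
         }

         def validate_rating(score):
             """Reject anything that is not an anchor value of the 1-5-10 scale."""
             if score not in RATINGS:
                 raise ValueError(f"{score} is not a 1-5-10 rating; use 1, 5, or 10")
             return score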

  28. Evaluation Scoresheets

  29. Procurement Schedule – Internal

  30. Individual Evaluations: Avoid Groupthink

  31. QUESTION – Evaluation Procedures: How should the evaluation process be conducted? Group vs. consensus vs. individual evaluation scores?

  32. Evaluation Committees
     • 3-7 individuals
     • Knowledge of the project and scope
     • Represent various perspectives
     • Beware of potential bias:
        • Preferences for certain vendors
        • Previous work history
        • Overly technical

  33. Evaluation Procedures & Logistics
     • Anonymous proposals
     • Focused on clear differential
     • Independent – prohibit individuals from "lobbying" others
     • Treat each evaluation category separately (rather than discussing an overall ranking)
     • Use an Evaluation Matrix to compile the total points (a sketch follows slide 34)

  34. Simple Scoring Methodology
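
     A minimal sketch of the compile step: average each firm's 1-5-10 ratings per category across the individual evaluators, then apply the category weights. The firm codes, ratings, and the three weights below are illustrative only:

         def compile_matrix(scores, weights):
             """scores[firm][category] = list of individual 1-5-10 ratings.
             Returns {firm: weighted total points}."""
             totals = {}
             for firm, categories in scores.items():
                 total = 0.0
                 for category, ratings in categories.items():
                     avg = sum(ratings) / len(ratings)
                     total += avg / 10 * weights[category]  # normalize to the max rating of 10
                 totals[firm] = round(total, 1)
             return totals

         weights = {"Execution Methodology": 10, "Risk Assessment": 20, "Value Assessment": 10}
         scores = {
             "Firm 103": {"Execution Methodology": [10, 10, 5],
                          "Risk Assessment":       [10, 5, 5],
                          "Value Assessment":      [5, 5, 5]},
             "Firm 104": {"Execution Methodology": [5, 5, 5],
                          "Risk Assessment":       [5, 5, 1],
                          "Value Assessment":      [10, 5, 5]},
         }
         print(compile_matrix(scores, weights))  # {'Firm 103': 26.7, 'Firm 104': 19.0}

     Because evaluators score independently and the matrix is compiled mechanically, no individual's ranking can dominate the discussion.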

  35. Who do you bring forward to Interviews?
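
     The slides do not prescribe a cut rule, but one simple approach is to shortlist every firm whose total lands within a margin of the leader rather than taking a fixed count; a sketch under that assumption, reusing totals like those from the previous example:

         def shortlist(totals, margin_pct=10.0):
             """Return firms within margin_pct of the top total, best first."""
             best = max(totals.values())
             cutoff = best * (1 - margin_pct / 100)
             return sorted((firm for firm, total in totals.items() if total >= cutoff),
                           key=lambda firm: -totals[firm])

         print(shortlist({"Firm 103": 26.7, "Firm 104": 19.0, "Firm 105": 25.1}))
         # ['Firm 103', 'Firm 105'] – Firm 104 is cut before the interviews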

  36. Interview Scores

  37. What to Look For…
     • Actionable plans = step-by-step solutions
        • Is there an actual solution, or do they say they will figure it out later ("to be determined…")?
     • Risks should be listed by priority
        • Prioritization is important – budget, schedule, and quality are often critical
     • Are the risks project-specific or generic?
        • Generic risks include: communication, teamwork, landscape, weather, safety
     • Is it clear and easy to read? Does it make sense from an Owner's perspective?
     • Do they use marketing information?
     • How often is their solution just to do what you'd expect them to do?

  38. Differentiators
     Red Flags:
     • Risk priority is arbitrary or misaligned with Client concerns
     • General / generic risks
     • No plan ("we will figure it out later")
     • "Will survey & verify upon award"
     • "Will provide solutions"
     • "Will communicate with…"
     • Long, detailed, technical, confusing
     • Marketing information
     Indicators of Expertise:
     • Risks prioritized in order of importance
     • Project-specific risks
     • Actionable steps to mitigate issues
     • Planning for project delivery & execution
     • Clear & easy to read
     • Identifies what support is needed from the client

  39. What to Look For… It's about what they say (the types of risks identified and how they're prioritized) – AND – how they say it (solutions with actionable steps and potential impacts to budget, schedule, or quality)!

  40. Offer Detailed Debriefings

  41. Debrief Process
     • Offer to ALL bidders – successful & unsuccessful
     • Include language in the RFP to offer this
     • Notify bidders at each step that they can request a debrief (e.g., if cut at the shortlist or upon final selection notification)
     • Conduct debriefs in a discussion format – it is important to walk through it with the vendor

  42. Debrief Document

  43. Advantages of Debriefings • Get better proposals next time!!!

  44. What are Evaluators Seeing?
     • Evaluators are looking for justification
     • If vendors do not differentiate themselves, evaluators are trained to give them equivalent scores
     • It is NOT about convincing the Owner to hire you with marketing information
     • It IS about telling the Owner how you'll execute the project – this is how to show your expertise and your understanding of the project requirements

  45. Evaluation Comments – Examples of LOW Proposal Scores
     • "No risks related to the building itself. Many risks were related to RFI processes (risk 4) and transferring risk solution back to the Owner, particularly risks 1, 3, and 5, which stated a solution of 'meeting with the owner' without offering options or recommending best approaches" – $32M construction
     • "A great deal of delineation from our stated program requirements. For example, item 2 indicated they are not an expert in the reactor approach (did not know the long-term desire for decommissioning?)" – $240M building re-design
     • "No statements that related to the specific requirements…"
     • "Inaccuracy of information vs. the RFP documents in risks 3-7."
     • "Identified issues but solutions were vague…fairly boilerplate…"
     • "Overly selling us, 'we want to deliver your vision'…did not provide any actionable steps in risks 1-3"

  46. Evaluation Comments – Examples of HIGH Proposal Scores
     • "Identified structural elements and specific solutions for building code concerns"
     • "Good information on floor plates in Controllable Risk item 3…"
     • "Demonstrated phasing and potential impacts" – $30M construction
     • "…explained connectivity between Light Rail link, building entrance, and building circulation considerations." – $240M building design
     • "Noted the costs for the laboratory structure and have included it within their base bid – good! Others have either excluded this or not commented." – $30M DB construction
     • "Risks 2-4 demonstrate good recognition of existing building potential issues & solutions. That will benefit the construction schedule as well as building life" – $25M construction renovation

  47. Evaluation Comments – High Interview Scores
     • Lead Arch: "Was able to draw out their project plan & presented a timeline with dates as follows: 1) info gathering for Feb 17 visioning session… 2) June to SD for building use & tie to fit scenarios/alternatives… 3) DD January 2013 tie to code, integrate Light Rail link to circulation plan, atrium usage…"
     • Mechanical: "Provided their project plan and schedule – and it aligns with the Arch team! Obviously worked closely together in review of their proposal."
     • "All members of Proponent team have commented on getting mechanical earlier (cores, window areas, etc.)"

  48. Plan Before You Sign

  49. Plan Before You Sign
     • Immediately address any major cost discrepancies (1-2 days) [if applicable]
     • Kickoff Meeting: the Contractor presents their plan
        • Identifies anything they need from the Owner and WHEN
        • Sets the clarification-phase schedule
        • Discusses any "red flags" and/or "design concerns"
     • Finalize project deliverables
     • Summary Meeting

  50. Greatest Sources of Project Change – [Chart: delays (schedule) and change orders ($) as the greatest sources of project change]
