
SOURCE SELECTION TRAINING (EXECUTION) <acquisition title>



  1. SOURCE SELECTION TRAINING (EXECUTION) <acquisition title> <Trainer’s name> <Trainer’s organization>

  2. Introductions • Facilities Orientation • Classroom Etiquette • Start/Stop Times • BlackBerrys/PDAs – Please Turn OFF! • Class interaction is encouraged – this is not a lecture! • Introductions

  3. Overview • Learning Objectives • RECAP – What Team Has Accomplished • Basic Purpose of SS • Source Selection Process Roadmap & AF SS Process • Schedule • Source Selection Organization • SSO Roles & Responsibilities • Initial Evaluation Phase • Proposal Receipt, SS Documentation & File Mgmt, EZ Source Tool, Basic Evaluator Guidelines • Initial Evaluation Activities • Technical, Past Performance, Cost/Price, ENs, Comp Range • Pre-FPR Evaluation Activities • Comp Range Elimination, Discussions, Request for Final Proposal Revision • Final Evaluation Activities • Proposal Evaluation, SSEB PAR, SSAC Comparative Analysis Report, Decision Brief • Award and Debriefing Activities • Records Retention • Ethics and Procurement Integrity • What is Source Selection Information? • References • Summary

  4. Example Workshop Schedule – Day 1 • 0800 – 1130: Introductions/Overview/Schedule, Learning Objectives, RECAP, Source Selection Organization, Source Selection Process, Initial Evaluation Activities • 1130 – 1230: Lunch • 1230 – 1600: Initial Evaluation Activities (cont.), Pre-Final Proposal Revision Evaluation Activities, Final Evaluation Activities • Breaks – As Required

  5. Example Workshop Schedule – Day 2 • 0800 – 1130: Recap from Day 1, Award Contract and Debriefing Activities, Mock Source Selection Activity (Optional) • 1130 – 1230: Lunch • 1230 – 1600: Ethics and Procurement Integrity Activities, Summary • Breaks – As Required

  6. Learning Objectives • After completion of this training, AF Source Selection Teams will understand: • Roles and responsibilities • Processes/procedures to evaluate proposals • Methodology and requirements for creation and maintenance of documentation • Approaches and tools available • Necessity of conducting thorough and timely debriefings • Importance and applicability of ethics and procurement integrity rules throughout the process

  7. Basic Purpose of Source Selection • To select the contractor (the source) • Whose proposal meets our requirements and provides the best value • “Best Value” means the expected outcome of an acquisition that provides the greatest overall benefit in response to the requirement • That is most likely to perform all requirements • That is best able to handle problems that might occur during contract performance • That provides the most realistic plan for contract performance

  8. Recap – What Team Has Accomplished A successful Source Selection requires a team that is engaged and prepared • Issued final Solicitation/RFP incorporating • Requirements Documents • Source Selection approach • Evaluation Methodology • Key aspects of approved Source Selection Plan • RFP is compliant with approved Acquisition Plan • Established Source Selection Team • All members have reviewed Source Selection Plan and RFP • All members and advisors have signed Non-Disclosure Agreements at start of SS and COIs after receipt of offerors’ proposals

  9. Air Force Source Selection Process – Proposal Evaluation

  10. Schedule <Trainer to insert the Source Selection Milestone Schedule>

  11. Source Selection Organization <Trainer to insert the Source Selection Organization>

  12. SSA - Role and Responsibilities Responsible for Proper and Efficient Conduct of Source Selection Process

  13. PCO - Roles and Responsibilities Primary Business Advisor and Principal Guidance Source Throughout Source Selection

  14. SSAC - Roles and Responsibilities SSAC Established by SSA to Gain Functional Expertise and Support

  15. SSEB Chairperson – Roles and Responsibilities SSEB Chair Manages Evaluation Team and Acts as Interface to SSA and SSAC

  16. SSEB Evaluators – Roles and Responsibilities SSEB Evaluators are Generally Organized Into Functional Areas

  17. Advisors – Roles and Responsibilities Two Distinct Types Of Advisors: Government And Non-government

  18. Initial Evaluation Activities - Proposal Receipt PCO ensures:

  19. Proposal Receipt (cont.) <Trainer to update past performance information based on RFP> • What will you use to capture evaluation and subsequent documentation? • What Source Selection software (EZ Source or other) is your team using? • Has all proposal data been loaded? • Have team members been adequately trained? • Have past performance volumes been received? • Will past performance team use past performance questionnaires?

  20. SS Documentation & File Management Source Selection Documentation Guidance • Emphasize standardization for configuration control • Establish ROEs for data management & accountability • Increases ability to audit & establish completeness of record • Fully understand capabilities & limitations of SS tools • SS tool does not contain entire record • Appoint SS Records Custodian • Recommended best practice • Document management, file structure, and disposition

  21. EZ Source Tool • Mandatory Use Criteria for Competitive Acquisitions • Value $50M or greater • Hi-Vis Programs • Consolidation of individual contracts • Re-competition as a result of sustained protest (corrective action) • Political sensitivity • PEO or CC deems high-risk program regardless of $$ value or ACAT level • Deployment of EZ Source will be accomplished in two spirals over next 18 months (beginning Jun 2013) • Spiral I locations meeting mandatory criteria must use EZ Source if solicitation is issued on or after location’s effective date (Policy Memo 13-C-02) • Spiral II - all remaining locations (estimated Oct 2014??)

  22. Source Selection Tools POCs • EZ Source tool (AFMC) • POC: Kerry Estes, DSN 785-5471, kerry.estes@wpafb.af.mil EZ Source Templates • Electronic Source Selection (ESS) tool (SMC) • POC: Mr. Ted Nguyen, DSN 633-5072, ted.nguyen@losangeles.af.mil • Decision Point • See www.acqcenter.com for more information • Fed Select • See www.caci.com • POC: Steve Ford, (703) 486-3266 ext 1031, sford@caci.com

  23. Initial Evaluation Phase • Each Evaluation Team assesses proposals for its factor – Technical, Past Performance, Cost/Price • Cost/Price Factor is evaluated IAW Section M, but not rated • Individual evaluators generate facts and findings in their assigned areas and record them on worksheets • Team Leaders combine information from individual Worksheets into Summary Sheets and then rate their subfactors and factors • Team Leaders generate their portion of the SSEB Proposal Analysis Report (PAR) using Summary Sheets as a basis • SSEB PAR and Summary Sheets are used to generate SSAC/SSA briefing slides

  24. Flow of Evaluation Findings SSAC BRIEF

  25. Basic Evaluator Guidelines • Perform your own independent assessments, use your knowledge and experience • Thorough narratives are essential to producing a fair, consistent, fact-based and discriminating evaluation • Basis for SSAC/SSA trade-off analysis • Narratives tied to proposal and RFP requirements • Assessments clearly describe benefits and/or detriments • Coordinate & communicate among teams through Team Leaders • Assure integrated evaluation of overlapping areas • Inform others of info that may help evaluation of their areas • Talk with fellow evaluators and Team Leader daily; however don’t let anyone unduly influence your evaluation

  26. Initial Evaluation Activities • Team Leads: • Have all evaluators been trained? • Are evaluators ready to begin this task? • Have you designated specific forms or worksheets? • How will you evaluate offerors’ proposals? • All evaluators review one proposal before moving to next OR • Each evaluator reviews only one proposal OR • Each evaluator reviews by subject matter OR • Different approach? • How will you decide an evaluator’s write up is sufficient?

  27. Initial Evaluation Activities (cont.) • Team Leads: does your approach . . . • Ensure integrated assessment of proposal? • Facilitate evaluation against published criteria? • Prevent comparison of proposals? • Encourage efficiency and consistency? • Prevent transfer of one offeror’s technical solution to another? • Evaluate what we asked for, not what we want to see or expect to see? • Guarantee every write up by every evaluator is sufficient for every requirement? Recommended Approach – Complete Evaluation of One Proposal Prior to Moving to Next One

  28. Initial Evaluation Activities (cont.) • Team Leads: What is your preference for … • Collaboration among your evaluators? • Do you want them to discuss any issues as they review? OR • Do you want to have them wait until they have completed reviewing each offeror’s proposal? OR • Do you want to have them collaborate at subfactor level? OR • Do you want to wait until all offerors have been evaluated? OR • Different approach? There is no one right answer!

  29. Initial Evaluation Activities (cont.) • Team evaluators: must ensure your evaluation… • Does not expand scope of published criteria or change requirement • Considers only information contained in proposal (except PP) • Evaluates proposals solely against Evaluation Factor/Subfactors established in RFP (Sections L & M) and not against one another • Thoroughly explains in plain language, why/how proposal does or does not meet applicable requirement • Is consistent across all proposals • Is so completely documented it will withstand a protest on its own

  30. ‘Quick Look’ Status Update • After receipt of proposals, SSEB Chair may update SSA with a Quick Look status brief/report • Provides SSA with an initial summary of proposals • Name(s) of offerors • Compliance of proposals with RFP • Proper page count • Identification of any exceptions taken to RFP requirements • Any obvious showstoppers Recommended Practice – May be done in conjunction with SSA Training

  31. Initial Evaluation Activities Technical Evaluations How-To Briefing for Source Selection Evaluators

  32. Technical Evaluations <Trainer to insert Section L information for Technical Factor(s)> Types of information you expect to receive from offerors …

  33. Technical Evaluations (cont.) <Trainer to insert Section M information for Technical Factor(s) without rating information> The criteria by which you will evaluate the offerors’ proposal …

  34. Technical Evaluations (cont.) • All strength(s) must be clearly documented! • Documentation should describe magnitude of benefit(s) to Government • DoD Source Selection Procedures guide defines a strength as: “An aspect of an offeror’s proposal that has merit or exceeds specified performance or capability requirements in a way that is advantageous to the Government during contract performance.” • When a strength is identified, assess whether offeror’s proposed approach would likely cause an associated weakness (risk) which may impact schedule, cost, or performance

  35. Technical Evaluations (cont.) • All weakness(es) must be documented • A Weakness is defined as a flaw in the proposal that increases the risk of unsuccessful contract performance • Any/all deficiencies must be documented • A Deficiency is defined as a material failure of a proposal to meet a Government requirement or a combination of significant weaknesses in a proposal that increases the risk of unsuccessful contract performance to an unacceptable level

  36. Technical Evaluations (cont.) • Technical Team Leads must … • Have an approach to consolidate reviews and analysis of evaluators • Have a plan to reach consensus with all evaluators on each subfactor • Do you have a plan to document consensus results including disposition of any disagreement by individual evaluators? • Have a plan to elevate disagreements when consensus is not reached • Review evaluator’s analysis for completeness and clarity • Interface with Past Performance & Cost/Price Teams

  37. Technical Evaluations (cont.) • Team Leads: What is your process for assigning ratings? • Will ratings be assigned by: • Individual evaluators? • Team leads? • SSEB Chair? Recommended Approach – Individual evaluators and team leads “recommend” ratings and SSEB Chair “assigns” ratings

  38. Initial Evaluation Activities Past Performance GAO case on Past Performance “Unknown” rating:

  39. Past Performance <Trainer insert Acquisition Section L information for Past Performance Factor> What types of information do you expect from Offerors?

  40. Past Performance (cont.) <Trainer should insert acquisition’s Section M and discuss Past Performance evaluation factor without rating information> Any questions on Past Performance criteria you are using?

  41. Past Performance Versus Responsibility Determination • Responsibility Determination (PCO): • Does offeror have capability to perform? • Financial capacity • Requisite facilities, etc. • Track record of integrity and business ethics • Appropriate systems in place (accounting, property, quality, etc.) • Pre-award surveys useful in providing detailed analysis • Results in PCO determination of responsibility • Past Performance Evaluation (PPET): • Identifies degree of government confidence for each competing offeror • Will offeror do the work successfully? • Based upon its track record of recent, relevant performance • Results in Acceptable/Unacceptable Rating or Performance Confidence Assessment Rating

  42. Past Performance Rating Process • Obtain Past Performance Information on Each Offeror and Determine Recency → Assess Relevancy for Each Recent Contract → Review Offeror Relevancy and Performance Ratings → Assign Offeror’s Past Performance Factor: Acceptable/Unacceptable or Confidence Rating

  43. Past Performance (cont.) • Past Performance evaluators must… • Understand various approaches to obtain Past Performance Information (PPI) • Evaluate Recency of PPI • Include explanation why each identified PPI complied with or failed to meet criterion • Identify those not further considered • Evaluate Relevancy of PPI • Use solicitation’s relevancy definitions to determine relevance for each factor or subfactor, as appropriate • Include complete explanation why factor or subfactor was determined within selected relevancy definition • When determined “not relevant” indicate instance not further considered

  44. Past Performance (cont.) • Past Performance evaluators must… • Assess quality of information received from various sources, i.e. references/questionnaires, CPARS/FAPIIS, etc. • Ensure records clearly explain and support assigned rating • Does your documentation identify why team concurs or differs with reported performance from past performance report (CPARs, questionnaire responses, etc.)? • While information from past performance report often forms basis of assigned performance rating, they do not have to be the same Note: Discussion and research may result in a different opinion of contract performance. When performance assessment differs from past performance report, record must document rationale for difference.

  45. Past Performance (cont.) • Good Practices • Develop clear records that explain why summary Technical subfactor and Cost/Price factor (if assigned) and overall Past Performance acceptable/unacceptable rating or confidence assessments were assigned • Identify why team assigned various acceptable/unacceptable or confidence ratings • Establish rating using an integrated analysis of positive and negative performance indicators identified for Technical factor or alternatively for each subfactor and Cost/Price factor • Develop comprehensive observations and conclusions that support overall Past Performance factor of acceptable/unacceptable or confidence assessment rating

  46. Initial Evaluation Activities – Cost and Price Evaluation (Cost/Price)

  47. Cost/Price <Trainer insert Section L & M information for Cost/Price> What types of information do you expect from Offerors, and how do you plan to evaluate it?

  48. Cost/Price (cont.) • Team Lead • Consider topics addressed in technical evaluation charts to decide which apply to Cost/Price evaluation team • Consider proposal requirements and evaluation criteria outlined in Sections L & M to decide what steps must be taken to complete evaluation • Must evaluate as outlined in Solicitation • Coordinate with evaluators in other areas to ensure consistency between proposed costs & other parts of proposal
