
PAST PERFORMANCE Conference


Presentation Transcript


  1. PAST PERFORMANCE Conference
  Theresa Kinney

  2. Agenda
  • Past Performance Categories
  • Past Performance Matrix
  • Rating Measurements
  • Website Feedback

  3. Past Performance Categories
  1. Customer Satisfaction
  2. Information Distribution
  3. Contract Adherence
  4. Delivery Schedule
  5. Program Management

  4. Past Performance Matrix
  Past Performance will use a four-tier system, shown below:
  • Blue = Excellent
  • Green = Very Good
  • Yellow = Good
  • Red = Poor
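
As a rough illustration only (none of the names below come from the SEWP system), the four-tier scale maps naturally onto an ordered enumeration. Later sketches in this transcript reuse this assumed Tier type:

```python
from enum import IntEnum

class Tier(IntEnum):
    """Four-tier past performance scale, ordered worst to best."""
    POOR = 0        # Red
    GOOD = 1        # Yellow
    VERY_GOOD = 2   # Green
    EXCELLENT = 3   # Blue
```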

  5. Rating Measurements
  The rating is based on each Contract Holder's business activities in each category and its past performance matrix. Depending on the number and type of infractions, the Contract Holder's rating drops one color for each set of infractions.
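
A minimal sketch of the drop-one-color rule, reusing the Tier enum above. The function name and signature are illustrative; how many infractions make up a "set" is defined per category and is not modeled here:

```python
def drop_tiers(current: Tier, infraction_sets: int) -> Tier:
    """Drop one color per set of infractions, bottoming out at Red."""
    # IntEnum arithmetic: subtracting yields an int, which max() clamps at POOR.
    return Tier(max(Tier.POOR, current - infraction_sets))
```

For example, drop_tiers(Tier.EXCELLENT, 2) lands on Tier.GOOD, i.e. two colors below Blue, on Yellow.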

  6. Past Performance Category Scoring: CUSTOMER SATISFACTION
  Refers to customer satisfaction with the Contract Holder's performance, including but not limited to the quality of products and services, responsiveness, and problem resolution.
  • Blue = 0-5 reports of quality issues or concerns, all resolved to customer satisfaction with little or no delay; random QA surveys of Contract Holders average Excellent.
  • Green = 5 or more reports of minor issues, all resolved to customer satisfaction quickly; or 2 or more issues not resolved quickly; random QA surveys average Very Good.
  • Yellow = More than 5 reports of issues, not all fully resolved; or a continuing pattern of 2 or more issues not resolved quickly; random QA surveys average Good.
  • Red = More than 10 reports of issues, not all fully resolved; or a continuing pattern (after 3 months in the Good range) of 2 or more issues not resolved quickly; random QA surveys average Poor.
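
A hedged sketch of how the report-count bands above might be encoded; the function and parameter names are invented for illustration. The slide's Blue and Green bands both touch five reports, so this version treats five promptly resolved reports as Blue, and it deliberately ignores the QA-survey averages and the "continuing pattern" clauses, which need history to evaluate:

```python
def customer_satisfaction_tier(reports: int, all_resolved_quickly: bool) -> Tier:
    """Rough encoding of the report-count bands; assumes the Tier enum above."""
    if reports <= 5 and all_resolved_quickly:
        return Tier.EXCELLENT        # Blue: 0-5 reports, all resolved promptly
    if reports > 10 and not all_resolved_quickly:
        return Tier.POOR             # Red: more than 10 reports, not all resolved
    if reports > 5 and not all_resolved_quickly:
        return Tier.GOOD             # Yellow: more than 5 reports, not all resolved
    return Tier.VERY_GOOD            # Green: everything in between
```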

  7. Category Scoring Continued: INFORMATION DISTRIBUTION
  Refers to information provided by the Contract Holder to customers through sales agents, associated companies, the website, handouts, etc.
  • Blue = All information is correct and fully articulated.
  • Green = One to two instances of incorrect or partial information, and the Contract Holder quickly resolved the situation.
  • Yellow = More than two instances of incorrect or partial information, even if the Contract Holder quickly resolved the situation; or instances of incorrect or partial information were not resolved and/or were repeated.
  • Red = Requests to fix incorrect or partial information are continuously ignored.

  8. Category Scoring Continued: CONTRACT ADHERENCE
  Refers to adherence to contract requirements, including but not limited to following quote and ordering procedures, sales training and meeting participation, and timeliness of required reports and fee payments.
  • Blue = Zero or one issue(s) occurred.
  • Green = One to two issues occurred, and the Contract Holder quickly resolved the situation.
  • Yellow = More than two issues occurred, even if the Contract Holder quickly resolved the situation; or issues occurred and were not resolved and/or were repeated.
  • Red = Requests to fix issues are continuously ignored.
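
The Information Distribution and Contract Adherence bands follow the same count-and-resolution pattern, so one illustrative helper (names assumed, reusing the Tier enum) can sketch both. It uses Contract Adherence's zero-or-one Blue band; note that Information Distribution's Blue strictly requires zero incorrect instances:

```python
def issue_count_tier(issues: int, resolved_quickly: bool, fixes_ignored: bool) -> Tier:
    """Shared count-based pattern for the two categories above."""
    if fixes_ignored:
        return Tier.POOR         # Red: fix requests continuously ignored
    if issues <= 1:
        return Tier.EXCELLENT    # Blue: zero or one issue
    if issues <= 2 and resolved_quickly:
        return Tier.VERY_GOOD    # Green: one to two issues, quickly resolved
    return Tier.GOOD             # Yellow: more than two, or not resolved/repeated
```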

  9. Category Scoring Continued: DELIVERY SCHEDULE
  The delivery schedule rating is based on two parts: (1) meeting the user's expected delivery date (default of 30 days) and (2) minimizing requests to update the user's expectations.
  • Blue = 100% of deliveries within the user's expected delivery time, and/or no more than 1 request for any order to extend an expected date, and/or no excessive requests for extensions in general.
  • Green = 95-99% of deliveries within the user's expected delivery time, and/or no more than 2 occurrences of requesting an order to be extended more than once, and/or rare requests for extensions in general.
  • Yellow = 80-94% of deliveries within the user's expected delivery time, and/or no more than 5 occurrences of requesting an order to be extended more than once, and/or occasional requests for extensions in general.
  • Red = Less than 80% of deliveries within the user's expected delivery time, and/or more than 5 occurrences of requesting an order to be extended more than once, and/or numerous requests for extensions in general.
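
The delivery bands are the most quantitative of the categories. A sketch of the on-time-percentage half alone, reusing the Tier enum; the extension-request limits are omitted for brevity, and the name delivery_tier is illustrative:

```python
def delivery_tier(on_time: int, total: int) -> Tier:
    """Map the on-time delivery percentage to the slide's bands.

    Assumes total > 0; only models the percentage half of the rating.
    """
    pct = 100.0 * on_time / total
    if pct >= 100:
        return Tier.EXCELLENT   # Blue: 100% within expected delivery time
    if pct >= 95:
        return Tier.VERY_GOOD   # Green: 95-99%
    if pct >= 80:
        return Tier.GOOD        # Yellow: 80-94%
    return Tier.POOR            # Red: below 80%
```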

  10. Category Scoring Continued: PROGRAM MANAGEMENT
  A key factor in the SEWP Program's success is the Contract Holder's commitment to ensuring their company properly manages the contract, as evidenced through their Program Management team. This factor rates the interaction between the SEWP Program Office and the Contract Holder's Program Management.
  • Blue = Continues to support the SEWP Program at 100%.
  • Green = Continues to support the SEWP Program, with one or two issues that have been resolved.
  • Yellow = Continues to support the SEWP Program, with three to four issues that have been resolved.
  • Red = Not fully supporting the SEWP Program, and issues have not been resolved.
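
For completeness, the Program Management bands in the same illustrative style, again reusing the assumed Tier enum. The slide only defines Yellow through four resolved issues, so this sketch simply keeps returning Yellow beyond that:

```python
def program_management_tier(resolved_issues: int, has_unresolved: bool) -> Tier:
    """Program Management bands from the slide above (names illustrative)."""
    if has_unresolved:
        return Tier.POOR         # Red: not fully supporting; issues unresolved
    if resolved_issues == 0:
        return Tier.EXCELLENT    # Blue: supporting the program at 100%
    if resolved_issues <= 2:
        return Tier.VERY_GOOD    # Green: one or two resolved issues
    return Tier.GOOD             # Yellow: three to four resolved issues
```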

  11. Past Performance on the Website
  www.sewp.nasa.gov > Vendor Information > Past Performance

  12. Feedback
  Customer Survey: Any customer is welcome to send the SEWP Program an email to let us know of any issues or problems they are having with the Contract Holders. Examples of such issues include:
  • Quality of Product/Service
  • Interaction and Responsiveness to Customers
  • Problem Resolution
  • Numerous Order Rejection Requests
