Southern Hemisphere Automated Snow/Ice (ASI) Stage 1 Test Readiness Review
March 15, 2012
Prepared By: Peter Romanov (NOAA-CREST/CUNY), Ken Jensen (Raytheon), Rory Moore (Raytheon)

Presentation Transcript


  1. Southern Hemisphere Automated Snow/Ice (ASI) Stage 1 Test Readiness Review. March 15, 2012. Prepared By: Peter Romanov (NOAA-CREST/CUNY), Ken Jensen (Raytheon), Rory Moore (Raytheon)

  2. ASI Stage 1 TRR Agenda
     9:00 – 9:15    Project Plan               Ken Jensen
     9:15 – 9:30    CDR Report                 Ken Jensen
     9:30 – 10:15   Unit Test Plan (I)         Ken Jensen
     10:15 – 10:30  Unit Test Plan (II)        Peter Romanov
     10:30 – 10:45  ADP Unit Test Readiness    Peter Romanov
     10:45 – 11:00  Break
     11:00 – 11:30  SA/AF/AU Test Readiness    Peter Romanov
     11:30 – 11:45  CRM Unit Test Readiness    Peter Romanov
     11:45 – 12:50  Risks and Actions          Ken Jensen
     12:50 – 1:00   Summary and Conclusions    Ken Jensen

  3. PROJECT PLAN • CDR REPORT • UNIT TEST PLAN • ADP UNIT TEST READINESS • SA/AF/AU UNIT TEST READINESS • CRM UNIT TEST READINESS • RISKS AND ACTIONS • SUMMARY AND CONCLUSIONS

  4. Section 1 – Project Plan. Presented by Ken Jensen (Raytheon), STAR Process Lead / QA

  5. ASI Product Team
     Development Lead: Ivan Csiszar
     Product Area Lead: Sean Helfrich
     Development Scientist: Peter Romanov
     Development Programmer: Peter Romanov
     Development Tester: Peter Romanov
     QA/Process Lead: Ken Jensen

  6. ASI – Development Project Plan
     The Development Project Plan (DPP) is a standard artifact of the STAR EPL process. The DPP identifies project objectives, stakeholder roles and tasks, resources, milestones, schedule, and budget. TRR reviewers can access this document at http://www.star.nesdis.noaa.gov/smcd/emb/ASI/ASIdoc_TRR.php
     Guidelines for the DPP are found in STAR EPL process asset DG-5.1. TRR reviewers can access this document at http://www.star.nesdis.noaa.gov/star/EPL_index.php

  7. ASI Project Plan – Changes Since CDR
     • Rory Moore added to Development team
     • Sean Helfrich replaces Liqun Ma as OSPO PAL
     • Revisions to review teams
     • Revision to TRR entry criteria

  8. ASI Stakeholders – Suppliers
     • OSD: Development funding
     • CREST: 0.7 FTE for programming and testing
     • OSPO: Metop AVHRR and MSG SEVIRI data needed for the project implementation is available from DDS through McIDAS (Contact: Truc Nguyen)
     • OSPO: DMSP SSMIS data needed for the project implementation is available from DDS (Contact: Hanjun Ding)
     • NCDC: In situ data is available via anonymous ftp

  9. ASI Stakeholders – Developers
     Development Lead: Ivan Csiszar (STAR)
     IPT Lead: Sean Helfrich (NIC)
     Development Scientist: Peter Romanov (CREST)
     Development Programmer: Peter Romanov
     Development Tester: Peter Romanov
     Development Documents: Peter Romanov, Ken Jensen (Raytheon), Rory Moore (Raytheon)

  10. ASI Stakeholders – Operators
     Product Area Lead: Sean Helfrich (OSPO/NIC)
     Installation and Acceptance: Sean Helfrich, William Pennoyer (SSAI)
     Operations: Richard Brooks (OSPO)
     Science Maintenance: Peter Romanov
     Reactive Maintenance: William Pennoyer
     O&M Documentation: Sean Helfrich, William Pennoyer

  11. ASI Stakeholders – Users
     • NCEP EMC Land Group (Contact: Mike Ek): Updates product requirements, evaluates product quality, and participates in at least one of the product development reviews (e.g., CDR)
     • National Ice Center (NIC) (Contact: Sean Helfrich): Technical advisor; participates in at least one of the product development reviews (e.g., CDR)
     • NCDC (Contact: Phil Jones): Works with OSPO to develop SA for CLASS

  12. ASI Stakeholders – Reviewers (1)
     Gate 3 Review Lead: Sean Helfrich (NIC)
     Gate 3 Review Team: Liqun Ma (OSPO), Ken Jensen (Raytheon), Tom Schott (OSD – consultant), Mitch Goldberg (STAR – consultant)
     Stage 1 CDR Review Lead: Ivan Csiszar
     Stage 1 CDR Review Team: Sean Helfrich, Liqun Ma, Ken Jensen, Zhaohui Cheng (OSPO QA), George Lawless (OSPO Security)
     Consultants: Tom Schott, Fuzhong Weng (STAR), Joe Mani (OSPO IT), Mike Ek (NCEP)

  13. ASI Stakeholders – Reviewers (2)
     Stage 1 TRR Review Lead: Zhaohui Cheng
     Stage 1 TRR Review Team: William Pennoyer (SSAI), Ken Jensen, George Lawless, Sean Helfrich
     Stage 1 SRR Review Lead: Sean Helfrich
     Stage 1 SRR Review Team: Ken Jensen, Ivan Csiszar, Zhaohui Cheng, George Lawless, Jerry Zhan (LSPOP)
     Consultants: Tom Schott, Fuzhong Weng (STAR), Joe Mani (OSPO IT), Mike Ek (NCEP), James Holton

  14. ASI Stakeholders – Reviewers (3)
     Stage 2 CDR Review Lead: TBD. Stage 2 CDR Review Team: TBD
     Stage 2 TRR Review Lead: TBD. Stage 2 TRR Review Team: TBD
     Stage 2 SRR Review Lead: TBD. Stage 2 SRR Review Team: TBD

  15. ASI Stakeholders – Management and Support
     Project Management:
        STAR Division Chief: Fuzhong Weng
        STAR Branch Chief: Ivan Csiszar
        OSPO Branch Chief: Ricky Irving
        OSD/SEID: Tom Schott
     Configuration Management: Min Li
     Data Management: Min Li
     Quality Assurance: Ken Jensen (Raytheon), Zhaohui Cheng (OSPO)

  16. ASI IMP – Project Milestones
     Gate 3 Review – Sep 14, 2010
     Stage 1 Critical Design Review – Nov 1, 2011
     Stage 1 Test Readiness Review – Mar 15, 2012
     Stage 1 System Readiness Review – May 16, 2012
     Stage 1 Delivery to Operations – May 31, 2012
     Stage 2 Critical Design Review – Aug 16, 2012
     Stage 2 Test Readiness Review – Oct 17, 2012
     Stage 2 System Readiness Review – Mar 21, 2013
     Stage 2 Delivery to Operations – Apr 12, 2013

  17. ASI Stage 1 Timeline – Step 9 Is Completed
     [Timeline graphic: CMMI Development to Transition to Operations (Stage 1) – Start 06/14/10, Gate 3 09/16/10, CDR 11/1/11, TRR 03/12, SRR 04/12, Delivery 05/12, Operations 07/12]
     • Jun 14, 2010: Began deployment of CMMI practices during step 5 of STAR EPL product life cycle
     • Jun – Sep 2010: CMMI-compliant ASI Development Project Plan (DPP)
     • Sep 16, 2010: Gate 3 Management Review. ASI project successfully completes PLAN Phase.
     • Nov 1, 2011: Combined Requirements/Design Review (CDR). Conclusion of DESIGN phase.
     • Mar 15, 2012: Test Readiness Review (TRR). WE ARE HERE.
     • Mar 2012: Pre-operational code testing in STAR Development Environment
     • Apr 2012: System integration and testing in STAR Test Environment
     • May 2012: Combined Code Test Review/System Readiness Review (CTR/SRR). Conclusion of BUILD Phase.
     • May 2012: ASI integrated pre-operational system delivered to OSPO
     • Jul 2012: SPSRB Operational Decision, followed by commencement of operations

  18. TRR Guidelines and Check List • Guidelines for the TRR reviewers are in STAR EPL process asset PRG-9 • Reviewers can access this document at http://www.star.nesdis.noaa.gov/star/EPL_index.php • The tailored TRR Review Check List is in the Development Project Plan (DPP) Appendix C • Reviewers can access this document at http://www.star.nesdis.noaa.gov/smcd/emb/ASI/ASIdoc_TRR.php

  19. TRR Report • The TRR Report (TRRR) is a standard artifact of the STAR EPL process. • The TRR reviewers should produce this report after conducting the TRR. • The report will be an artifact for the System Readiness Review. • Guidelines for the TRRR are found in STAR EPL process asset DG-9.3 • TRR reviewers can access this document at http://www.star.nesdis.noaa.gov/star/EPL_index.php

  20. ASI TRR – Review Objectives (1) • Review changes to the project plan since CDR • Review the CDR Report • Review the software architecture • External interfaces (changes since CDR) • Software units (changes since CDR) • Context-Layer, System-Layer, Unit-Layer, and Sub-Unit-Layer data flows (changes since CDR) • Review changes to the detailed design since CDR • Review changes to the verification and validation plan since the CDR

  21. ASI TRR – Review Objectives (2)
     • Demonstrate the test readiness of each unit in the software architecture
     • Provide all applicable technical data to support unit testing, including: pre-operational code and test data; unit test plan
     • Identify and update project risks. Make recommendations for risk mitigation plans and actions.
     • Document the closing of all action items since the CDR. Make recommendations for open actions and new actions.

  22. PROJECT PLAN • CDR REPORT • UNIT TEST PLAN • ADP UNIT TEST READINESS • SA/AF/AU UNIT TEST READINESS • CRM UNIT TEST READINESS • RISKS AND ACTIONS • SUMMARY AND CONCLUSIONS

  23. Section 2 – CDR Report Presented by Ken Jensen (Raytheon) STAR Process Lead / ASI QA

  24. CDR Report • The CDR Report (CDRR) is the approved report of the CDR reviewers: • Ivan Csiszar (STAR, Review Lead) • Sean Helfrich (OSPO/NIC) • Liqun Ma (OSPO) • Ken Jensen (Raytheon) • Zhaohui Cheng (OSPO QA) • George Lawless (OSPO Security) • Weizhong Zheng (NCEP) • The CDRR reports the status of the CDR entry criteria and exit criteria • The CDRR includes an assessment of risk items, with recommendations for risk mitigation • Status of the risk items will be addressed later in this TRR • The CDRR has established the entry criteria and exit criteria for the ASI TRR • The CDRR can be accessed at http://www.star.nesdis.noaa.gov/smcd/emb/ASI/ASIdoc_TRR.php

  25. The CDR Report Closes the CDR and Sets Up the TRR
     [Diagram: the CDR Report and Appendix, with the CDR Check List Disposition, flow from the Critical Design Review (CDR) into the TRR artifacts – TRR Check List, TRR Entry Criteria, TRR Exit Criteria, Risks and Actions – and set up future reviews: Test Readiness Review (TRR), System Readiness Review (SRR). The Gate 3 Review (G3R) precedes the CDR.]

  26. ASI Stage 1 CDR – Entry Criteria (1)
     Entry # 1 – A Gate 3 Review Report (G3RR) has been written. The CDR reviewers have access to the review version of the G3RR. STATUS: PASS
     Entry # 2 – A Development Project Plan (DPP) has been written. The CDR reviewers have access to the review version of the DPP. STATUS: PASS
     Entry # 3 – An Operations Concept Document (OCD) has been written. The CDR reviewers have access to the review version of the OCD. STATUS: PASS
     Entry # 4 – A Requirements Allocation Document (RAD) has been written. The CDR reviewers have access to the review version of the RAD. STATUS: PASS

  27. ASI Stage 1 CDR – Entry Criteria (2)
     Entry # 5 – An Algorithm Theoretical Basis Document (ATBD) has been written. The CDR reviewers have access to the review version of the ATBD. STATUS: PASS
     Entry # 6 – A Software Architecture Document (SWA) has been written. The CDR reviewers have access to the review version of the SWA. STATUS: PASS
     Entry # 7 – A Detailed Design Document (DDD) has been written. The CDR reviewers have access to the review version of the DDD. STATUS: PASS
     Entry # 8 – A Verification and Validation Plan (VVP) has been written. The CDR reviewers have access to the review version of the VVP. STATUS: PASS

  28. ASI Stage 1 CDR – Entry Criteria (3)
     Entry # 9 – A Project Status Report (PSR) has been written. The CDR reviewers have access to the review version of the PSR. STATUS: PASS
     Entry # 10 – A Critical Design Document (CDD) has been written. The CDR reviewers have access to the review version of the CDD. STATUS: PASS
     Entry # 11 – A Project Baseline Report (PBR) has been written. The CDR reviewers have access to the review version of the PBR. STATUS: PASS

  29. ASI Stage 1 CDR –Project Risks • 17 open project risks are identified in the CDRR • Each risk includes a Risk History, Risk Assessment, Risk Mitigation Plan, and a list of actions to implement the mitigation plan (“associated” actions) • 84 associated actions were identified • 44 of these were closed or withdrawn by the CDRR • 18 of these are Gate 3 Review actions that remained open • 22 of these are new CDR actions that are open • The status of the 17 open risks and 40 open actions from CDR will be addressed in Section 7

  30. ASI Stage 1 CDR – Exit Criteria (1)
     Exit # 1 – G3R "Conditional Pass" items have been satisfactorily disposed of. STATUS: PASS
     Exit # 2 – G3R "Defer" items have been satisfactorily disposed of. STATUS: PASS
     Exit # 3 – Stage 1 operations concept and OCD are satisfactory. STATUS: PASS
     Exit # 4 – Stage 1 requirements identification is satisfactory. STATUS: PASS
     Exit # 5 – Stage 1 requirements analysis is satisfactory. STATUS: PASS
     Exit # 6 – Stage 1 requirements traceability is satisfactory. STATUS: PASS

  31. ASI Stage 1 CDR – Exit Criteria (2)
     Exit # 7 – Stage 1 requirements tracking plan is satisfactory. STATUS: PASS
     Exit # 8 – Stage 1 algorithm theoretical basis and ATBD are satisfactory. STATUS: PASS
     Exit # 9 – Stage 1 software architecture and SWA are satisfactory. STATUS: PASS
     Exit # 10 – Stage 1 external interfaces are satisfactory. STATUS: PASS
     Exit # 11 – Stage 1 software detailed design and DDD are satisfactory. STATUS: PASS
     Exit # 12 – Stage 1 verification and validation plan and VVP are satisfactory. STATUS: PASS

  32. ASI Stage 1 CDR – Exit Criteria (3)
     Exit # 13 – Stage 1 requirements allocation and RAD are satisfactory. STATUS: PASS
     Exit # 14 – Stage 1 baseline and PBR are satisfactory. STATUS: PASS
     Exit # 15 – Project risks and actions are acceptable. STATUS: PASS
     Exit # 16 – Project status and PSR are satisfactory. STATUS: PASS
     Exit # 17 – Project is ready for the Stage 1 Build phase. STATUS: PASS
     CDR was closed, with 40 actions deferred to the Build phase. These will be discussed in Section 7.

  33. ASI Stage 1 TRR – Entry Criteria # 1–5
     • Entry # 1 – A Critical Design Review Report (CDRR) has been written. The TRR reviewers have access to the review version of the CDRR.
     • Entry # 2 – A Development Project Plan (DPP) has been written. The TRR reviewers have access to the review version of the DPP.
     • Entry # 3 – A revision of the Requirements Allocation Document (RAD) has been written. The TRR reviewers have access to the review version of the RAD.
     • Entry # 4 – A Software Architecture Document (SWA) has been written. The TRR reviewers have access to the review version of the SWA.
     • Entry # 5 – A Detailed Design Document (DDD) has been written. The TRR reviewers have access to the review version of the DDD.

  34. ASI Stage 1 TRR – Entry Criteria # 6–11
     • Entry # 6 – A Verification and Validation Plan (VVP) has been written. The TRR reviewers have access to the review version of the VVP.
     • Entry # 7 – A Unit Test Plan (UTP) has been written. The TRR reviewers have access to the review version of the UTP.
     • Entry # 8 – Pre-operational code units that implement the detailed design are in the development test environment. The TRR reviewers have access to this environment.
     • Entry # 9 – Unit test data are in the development test environment. The TRR reviewers have access to this environment.
     • Entry # 10 – A Test Readiness Document (TRD) has been written. The TRR reviewers have access to the review version of the TRD.
     • Entry # 11 – A Project Baseline Report (PBR) has been written. The TRR reviewers have access to the review version of the PBR.

  35. ASI Stage 1 TRR –Tailored / Waived Entry Criteria • There are no tailored entry criteria • There are no waived entry criteria

  36. ASI Stage 1 TRR – Exit Criteria # 1–5
     • Exit # 1 – CDR "Conditional Pass" items have been satisfactorily disposed of
     • Exit # 2 – CDR "Defer" items have been satisfactorily disposed of
     • Exit # 3 – Changes to the project plan since CDR are approved
     • Exit # 4 – Requirements allocation changes since CDR are approved
     • Exit # 5 – Changes to external interfaces since CDR are approved

  37. ASI Stage 1 TRR – Exit Criteria # 6–9
     • Exit # 6 – Changes to the software architecture since CDR are approved
     • Exit # 7 – Changes to the detailed design since CDR are approved
     • Exit # 8 – Changes to the verification and validation plan since CDR are approved
     • Exit # 9 – The Stage 1 unit test plan and UTP are satisfactory

  38. ASI Stage 1 TRR – Exit Criteria # 10–14
     • Exit # 10 – Pre-operational code to implement the detailed design has been written according to standards and has been built into executable units
     • Exit # 11 – Stage 1 unit test data are satisfactory
     • Exit # 12 – Project baseline and PBR are satisfactory
     • Exit # 13 – The TRRR documents the current status of project risks, actions, and TRR exit criteria
     • Exit # 14 – Project risks and actions are acceptable. Project is ready for unit testing.

  39. CDR Report Appendix – CDR Check List Disposition
     • The Stage 1 CDRR Appendix includes the disposition status for each of 160 CDR check list items (CLI)
     • 151 of the CLI received a "Pass" disposition with no identified risk. These include the CDR entry criteria and exit criteria.
     • 9 of the CLI received a "Defer" disposition with associated risks and actions, to be discussed in Section 7

  40. PROJECT PLAN • CDR REPORT • UNIT TEST PLAN • ADP UNIT TEST READINESS • SA/AF/AU UNIT TEST READINESS • CRM UNIT TEST READINESS • RISKS AND ACTIONS • SUMMARY AND CONCLUSIONS

  41. Section 3 – Unit Test Plan. Presented by Ken Jensen and Peter Romanov

  42. Section 3.1 – Unit Test Plan (I). Presented by Ken Jensen

  43. The RAS and VVP Provide the Bridge from the Design to the Test Plans
     [Diagram: Design Phase (CDR) artifacts – RAD, SWA, DDD – flow through the RAS and VVP into the Build Phase (TRR) test plans – UTP, STP]

  44. Unit Testing
     Unit testing is performed to confirm that the software functions as designed and produces the expected outputs. It is the first instance of formal verification and validation, which is intrinsic to the Build Phase.
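As an illustration of this idea only, a unit test of this kind might look like the sketch below. The classifier, its thresholds, and the test values are invented for illustration; they are not the ASI algorithm.

```python
# Toy example of a unit test confirming that a function produces the
# expected outputs. classify_pixel and its thresholds are hypothetical;
# they do not represent the actual ASI snow/ice retrieval.

def classify_pixel(reflectance, brightness_temp_k):
    """Toy classifier: call a pixel 'snow' if it is bright and cold."""
    if reflectance > 0.4 and brightness_temp_k < 273.15:
        return "snow"
    return "no_snow"

def test_classify_pixel():
    # Bright, cold pixel -> snow
    assert classify_pixel(0.6, 260.0) == "snow"
    # Dark, warm pixel -> no snow
    assert classify_pixel(0.1, 290.0) == "no_snow"
    # Bright but warm pixel -> no snow
    assert classify_pixel(0.6, 280.0) == "no_snow"

test_classify_pixel()
print("all unit tests passed")
```

Each test case pairs a known input with its expected output, which is the pattern the Unit Test Plan formalizes for every software unit.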

  45. Unit Testing Within the Build Phase
     • Code development, testing, and integration is inherently iterative
     • Unit code is written and debugged until it can be compiled and run to produce expected outputs
     • Test data are developed to test the code's functional performance and quality of outputs
     • Unit tests reveal deficiencies that are corrected through code refinement
     • Refined units are integrated into an end-to-end pre-operational system that is tested and refined until all requirements are met
     • The standard practices of the STAR EPL Build phase accommodate this iterative nature by including feedback loops between code and test data development, code test and refinement, and system integration and testing
     [Callout: UNIT TESTING TO COMMENCE UPON TRR APPROVAL – WE ARE HERE]

  46. Verification and Validation
     Verification is the formal process of confirming that the requirements specified for a specific product or system are satisfied by the completed product or system.
     Validation is a process of evaluation, integration, and test activities conducted to ensure that the final developed system will satisfy the needs and expectations of customers, users, and operators.
     In a well-designed system, needs and expectations are completely captured by the requirements allocation. In that case, there is no meaningful distinction between verification and validation.
     The methods and planned activities for verification and validation of the project's process and products constitute the project verification and validation plan.

  47. Project Requirements Have Been Established and Refined
     • Established at Critical Design Review (CDR): Critical Design Document (CDD), Requirements Allocation Document (RAD) v1r0
     • Modified for Test Readiness Review (TRR): Test Readiness Document (TRD – this presentation), Requirements Allocation Document (RAD) v1r1

  48. Requirements Allocation Document (RAD)
     RAD v1r1, a TRR artifact, can be obtained at http://www.star.nesdis.noaa.gov/smcd/emb/ASI/ASIdoc_TRR.php
     RAD Document Guidelines are in STAR EPL process asset DG-6.2, available at http://www.star.nesdis.noaa.gov/star/EPL_index.php
     The RAD contains the basic and derived requirements for the work products. RAD v1r1 includes minor modifications to the requirements, based on issues that occurred during code development (step 9).

  49. ASI Stage 1 Basic Requirements (1)
     0) The Southern Hemisphere Automated Snow/Ice (ASI) Stage 1 development project shall adopt the standard practices of the STAR Enterprise Product Lifecycle (EPL), as established in the STAR EPL process assets v3.0
     • The ASI Stage 1 system shall generate a gridded Daily Snow Cover Stage 1 product, called the "Metop 2 km Product"
     • The ASI Stage 1 system shall generate a gridded Daily Blended Snow Cover Stage 1 product, called the "Metop 2 km Blended Product"

  50. ASI Stage 1 Basic Requirements (2)
     • The ASI Stage 1 system shall have a data ingest capability
     • The ASI Stage 1 system shall implement the ASI Stage 1 algorithm to generate a retrieval of Southern Hemisphere Snow/Ice maps
     • The ASI Stage 1 system shall generate a metadata product
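To illustrate how a "shall generate a metadata product" requirement might later be exercised by a unit test, a minimal sketch follows. Every field name and value here is hypothetical, invented for this example rather than taken from the ASI design documents.

```python
import json

# Hypothetical metadata record for a gridded daily product. All field
# names and values are illustrative, not from the operational ASI system.
def make_metadata(product_name, resolution_km, date):
    return {
        "product_name": product_name,
        "grid_resolution_km": resolution_km,
        "observation_date": date,
    }

meta = make_metadata("Metop 2 km Product", 2.0, "2012-03-15")

# A unit test for the metadata requirement could check that the required
# fields are present and that the record serializes cleanly.
required = {"product_name", "grid_resolution_km", "observation_date"}
assert required <= set(meta)
json.dumps(meta)  # must not raise
print("metadata requirement check passed")
```

Checks of this form tie each basic requirement to a concrete pass/fail criterion, which is what the Unit Test Plan and the RAD's requirements allocation are meant to make traceable.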
