
Space Network Access System (SNAS) Operational Readiness Review (ORR)



  1. Space Network Access System (SNAS) Operational Readiness Review (ORR) September 4, 2008 NASA Code 452 Space Network (SN) Project

  2. ORR Agenda
  • Welcome/Introduction – Rose Pajerski
  • Operational Hardware – Tom Holub
  • Acceptance Testing – Merri Benjamin
  • WSC IT Security Evaluation – Rich Pearce
  • SNAS Training – Dave Warren
  • Software Sustaining Plan – Chii-der Luo
  • WSC Operations & Maintenance – Marty Antholzner
  • Post-ORR Activities – Rose Pajerski

  3. Purpose of Review
  • The objectives of the review are to provide a comprehensive briefing that will:
    • Review SNAS requirements and the results of acceptance testing activities
    • Present transition plans and activities
    • Brief the system, personnel, and documentation readiness for transition
    • Demonstrate the system’s readiness to support NCCDS operations at WSC
  • Readiness criteria
    • All applicable functional, unit, and subsystem testing has been successfully completed
    • All RFAs/issues from previous reviews have been satisfied according to plan
    • All known significant system discrepancies have been identified and dispositioned
    • Interfaces are under configuration management
    • Operations personnel are identified, trained, and ready
  • Upon completion of the review, the SNAS team seeks approval to proceed with transition to operations for the NCCDS interface

  4. ORR Objectives
  • The ORR presentation will address each item below:
    • All validation testing is complete
    • Test failures and anomalies from validation testing have been resolved
    • All operational and enabling products necessary for normal and contingency operations have been tested and are in place
    • The operations handbook has been approved
    • Training has been provided to users and operators
    • Operational contingency planning has been accomplished, and all personnel have been trained

  5. Welcome / Introduction Rose Pajerski

  6. Review Board
  • Joe Stevens (chair), NASA GSFC Code 730
  • Joe Aquino, NASA JSC ISS/SST
  • Markland Benson, NASA WSC Code 583
  • Dave Campbell, NASA GSFC Code 584, HST
  • Curtis Emerson, NASA GSFC Code 530
  • Lynn Myers, NASA GSFC Code 450

  7. Prior Milestones
  • SNAS Release 1 Milestones
    • System Requirements Review – 7/08/03
    • Delta-System Requirements Design Review – 4/28/05
    • Preliminary Design Review – 9/12/05
    • Critical Design Review – 5/04/06
    • Implementation – 6/5/06 - 7/20/07
    • Integration/System Testing – 8/07 - 5/08
    • Beta Testing – 8/07 - 8/08
    • Server Shipment to WSC – 5/06-12/08
    • System Installation and Checkout – 5/14-30/08
    • Test Readiness Review – 6/6/08
    • NCCDS Acceptance Testing – 6/23-8/29/08
    • Customer Training – 7/08-8/08
    • WSC Operations Training – 8/26/08
    • Performance Test – 8/28/08

  8. Moving Forward Schedule (scheduled completion)
  • Operational Readiness Review – today
  • Operational Availability (NCCDS) – 9/23/08
  • SNAS/DAS Acceptance Testing complete – Oct 08
  • Delta-ORR for SNAS/DAS – Oct 08
  • Customer Training (CIM #9, WSC, customer sites) – Nov 08
  • Operational Availability (DAS) – TBS
  • DAS customers transition to SNAS (as a group) – TBS
  • End SWSI and UPS fixes/releases – TBS

  9. Customer Community Milestones
  • Group Customer Interface Meetings
    • April 2005 (pre-SRR) through July 2008 (AT)
  • Beta Testing
    • Participants: NOMs, WSC, HST, JSC, TRMM, SP&M, SPTR
    • Release 1: 4/07 – 8/08
  • Acceptance Testing
    • MOC operational data sets: HST, JSC, TRMM, Landsat, SWIFT
    • MOC operational scenarios: JSC, SP&M, SWIFT
    • WSC AT participants: JSC ISS & Shuttle, NOMs, Development Team (performance)

  10. Project Documentation
  • System Requirements Doc. (DCN 002) – CCB approved (4/30/08)
  • Operations Concept Doc. (DCN 002) – CCB approved (11/15/06)
  • ICD between DAS/SNAS – CCB approved (5/03/06)
  • ICD between EPS/SNAS – CCB approved (11/15/06)
  • ICD between SN/CSM (DCN 002) – CCB approved (10/29/07)
  • Security Documentation – 452 approved (10/29/07)
  • System Test Document – Final, 5/16/08
  • Acceptance Test Plan – Final, 6/04/08
  • MOC Client Users Guides – Draft to CCB
  • O&M Client Users Guides – Draft to CCB
  • Server Operators Guide – Draft to CCB

  11. Operational Server Hardware Tom Holub

  12. Hardware Status
  • SNAS Server System
    • Two racks
    • 2 Open Servers
    • 2 Closed Servers
    • 2 Closed Data Servers
    • Closed RAID array
    • UPS
    • NISN-managed switches
    • Installed with EC-TO 060-1
  • SNAS Server System connections
    • Open and Closed IONet complete except for out-of-band connectivity
  • Sparing plan
    • One APS UPS battery
    • One HP system disk
    • One EMC CLARiiON NAS disk
  • Maintenance plan for servers
    • Warranty documents have been transferred

  13. Hardware Status (cont’d)

  14. SNAS (AT) Architecture

  15. Acceptance Testing Merri Benjamin

  16. Acceptance Test Overview (1 of 3)
  • Testing Overview – Scope
    • Both MOC and O&M Client functionality
    • SNAS/NCCDS interface (ANCC)
    • SNAS/DAS interface (HMD DAS)
    • Limited EPS/SNAS interface testing
      • Used PC Client platform directory as pseudo-EPS
      • JSC EPS end-to-end with Beta System
    • Server operations (e.g., High Availability (HA), logging)
    • Exercised operational procedures (e.g., database recovery, SW delivery)
  • Test cases executed and procedures exercised to verify system functional and performance requirements

  17. Acceptance Test Overview (2 of 3)
  • Testing Environment
    • SNAS Servers connected to IONet with data passed to and from the ANCC and HMD DAS
    • Clients connected to Open and Closed IONet
    • Clients hosted on Windows PC platforms only
    • Tested in OPS mode only (OPS database instance)
  • Test Support
    • WSC testers – TO&A, OPS, and SE Departments
    • WSC SA/Oracle DBA, NCCDS, and DAS – Software Department
    • JSC representatives (Jeremy Gibson and Mike Duffy)
      • Traveled to WSC
      • Tested recurrent scheduling, orbital input processing, and EPS functions
      • Performed schedule planning and schedule execution for ISS, STS, and ATV (functional and performance testing)

  18. Acceptance Test Overview (3 of 3)
  • Status Meetings
    • Held AT status meetings on Mondays and Thursdays
    • Meeting agenda: outstanding test issues, IDR review, and work-off plan
  • Test CM
    • Discrepancies maintained and tracked in the CDS Remedy IDR database
    • Five patches delivered
      • Fixes tested and regression tests run
    • Results presented against the Release Candidate 5 baseline
  • Acceptance Test Report to be delivered 10/03/08

  19. Acceptance Test Results
  • MOC Client Functionality
  • O&M Client Functionality
  • Server/System Testing
  • IT Security Controls

  20. Acceptance Test Results - MOC Client (1 of 6)
  • 5 test cases – general functions for configuring the User/Client environment
    • Login, setting session preferences, workspace set-up, etc.
    • Results: 5 passed
  • 8 test cases – general functions for configuring a mission
    • MOC roles/accounts, EPS set-up, Service Codes, orbital data, etc.
    • Results: 8 passed
  • 6 test cases – import data for both DAS and NCCDS customers
    • State vectors, TCW, TSW, and PSAT/UAV files
    • Results: 4 passed and 2 partial successes
      • State vector transmits to ANCC successful; state vector transmits to HMD DAS failed

  21. Acceptance Test Results - MOC Client (2 of 6)
  • 9 test cases – NCCDS scheduling
    • SAR, SDR, RR, WLR, Recurrent Scheduling, TUT, etc.
    • Results: 9 passed, with the following notes
      • EPS-AUTONCC had the Client platform as the EPS node
      • Replace Requests function only when submitted through the Schedule Request window (not the Active Schedule window)
    ** JSC performed schedule planning for ISS, STS, and ATV
  • 9 test cases – DAS scheduling (not tested with Candidate 5)
    • TVR, RAR, RAMR, RADR, PBKR, etc.
    • Results: 8 partial successes and 1 blocked, with the following notes
      • Bulk schedule files to DAS – not operationally used – no bulk file available
      • EPS-AUTODAS – limited to bulk state vectors
    ** Performed the WSC TO&A DAS Regression Test each time HMD DAS was available (WSC DAS OPS with SNAS)

  22. Acceptance Test Results - MOC Client (3 of 6)
  • 6 test cases – Scheduling Tools (both NCCDS and DAS)
    • Active Schedule Summary, Request Schedule Summary, TSW shift, etc.
    • Results: 6 passed
  • 2 test cases – real-time monitoring
    • UPDs, Switch User
    • Results: 2 passed
  • 3 test cases – reports and queries
    • Confirmed Events report, Rejected/Declined report, user environment displays, etc.
    • Results: 3 passed

  23. Acceptance Test Results - MOC Client (4 of 6)
  • MOC Client Summary
  • Remaining open IDRs
    • SNAS/NCCDS – 3 low priority
      • IDR #26506 (Low) – TSW deletes when data ranges overlap
      • IDR #26507 (Low) – TSW Msg ID validation
      • IDR #26451 (Low) – Replace Request submitted through Active Schedule

  24. Acceptance Test Results - MOC Client (5 of 6)
  • Remaining open IDRs (cont’d)
    • SNAS/DAS – 2 high priority and 4 low priority (Candidate 5 fixes not tested)
      • IDR #26288 (High) – state vector transmits to DAS
      • IDR #26415 (High) – modify on-going event
      • IDR #26496 (Low) – Data Format Protocol not saved in SSC
      • IDR #26422 (Low) – unable to modify DAS service end time using RAMR
      • IDR #26423 (Low) – RAMR reference request ID window does not include “ANY”
      • IDR #26424 (Low) – TDRS “ANY” cannot be deleted via RADR
    • General – 1 low priority
      • IDR #26308 (Low) – dialog boxes do not always close as expected

  25. Acceptance Test Results - MOC Client (6 of 6)
  • Recommended additions to the “wish list” for consideration in future releases
    • Add a combined Scheduler/Controller role
    • Add to the GUI a listing of the SSC/service type naming convention
    • Add the capability to add an SSC by copying an existing SSC
    • Add the capability to delete a TSW by TSW Set ID and TDRS ID
    • Add the capability to filter the Active Schedule and Schedule Request windows by date range
    • Change vector import and transmit windows to default to NCC
    • Add a reload button for the TSW Summary

  26. Acceptance Test Results
  • MOC Client Functionality
  • O&M Client Functionality
  • Server/System Testing
  • IT Security Controls

  27. Acceptance Test Results - O&M Client (1 of 2)
  • 3 test cases – general Client/session configuration
    • Login, view status
    • Results: 3 passed
  • 9 test cases – mission and user configurations
    • SSC update, SUPIDEN, TDRS, establish connections, etc.
    • Results: 9 passed
  • 2 test cases – system monitoring
    • Results: 2 passed

  28. Acceptance Test Results - O&M Client (2 of 2)
  • O&M Client Summary
    • Remaining open IDRs – none
    • Alert window will be delivered in the next release
  • Recommended additions to the “wish list” for consideration in future releases
    • Add the capability to add an SSC by copying an existing SSC
    • Add role/access privileges – allows for system monitoring without a DBA account
    • O&M approval window
      • Mark the changes being requested from the MOC Client
      • Allow for dialog for approval (not just rejected requests)

  29. Acceptance Test Results
  • MOC Client Functionality
  • O&M Client Functionality
  • Server/System Testing
  • IT Security Controls

  30. Acceptance Test Results - Server/System (1 of 8)
  • Test overview
    • High Availability application and configuration
    • System/application logging
    • SNAS/SN interface
    • End-to-end testing – EPS–Client–Server–ANCC/HMD DAS
      • EPS to Client – same platform
      • Limited HMD DAS availability
    • System performance and loading test

  31. Acceptance Test Results - Server/System (2 of 8)
  • High Availability (HA) Testing
    • Linux-HA product
      • Runs on all three clusters (Open Servers, Closed Servers, and Data Servers)
      • Provides for “prime” and “backup” to meet high-availability/performance requirements
      • Each cluster has two nodes with resources (applications) distributed
      • Supports two modes of operation – OPS and EIF
    • Tested basic operations of the application with command statements and with the GUI
      • Start, Stop, Banish, Allow, etc.
      • Resource failure operations
      • Component (node) failure operations (server powered off)
    • Results – partial success
      • Some unexpected bouncing of resources between nodes
      • Following a failure, state returns to the last state, not standby
      • Failure counts not incrementing
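The prime/backup behavior exercised above can be sketched minimally. This is an illustrative model only: the node and resource names, and the placement rule, are assumptions, not the actual Linux-HA configuration used on the SNAS clusters.

```python
# Minimal sketch of two-node failover: resources run on the prime node
# and move to the backup when the prime fails. Names are illustrative.
def place_resources(resources, prime_up, backup_up):
    """Map each resource (application) to the node that should host it."""
    if prime_up:
        host = "prime"
    elif backup_up:
        host = "backup"
    else:
        host = None  # whole cluster down
    return {r: host for r in resources}

# Prime node powered off: everything fails over to the backup node.
placement = place_resources(["SvE", "SNIF"], prime_up=False, backup_up=True)
print(placement)  # {'SvE': 'backup', 'SNIF': 'backup'}
```

The IDRs on this slide concern exactly the behavior such a model leaves out: where a resource lands after the failed node recovers, and whether failure counts accumulate.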

  32. Acceptance Test Results - Server/System (3 of 8)
  • System/Application Logging
    • Application logs
      • SAM, SvE, DSDM, SNIF, and SDIF
    • Logs required to contain
      • Source
      • Severity code
      • Timestamp field
      • Explanation code
    • Results: Pass
  • SNAS/SN Interface
    • SNAS/SN – TCP/IP protocol
    • Results: Pass
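As a hedged illustration of the four required fields, a log line might be assembled as below. The field order, delimiter, and example values are assumptions; the actual SNAS log layout is not shown in this review.

```python
def format_log_entry(timestamp, source, severity, code, text):
    """Build one log line carrying the four required fields:
    source, severity code, timestamp, and explanation code.
    The pipe-delimited layout is illustrative only."""
    return f"{timestamp} | {source} | {severity} | {code} | {text}"

line = format_log_entry("2008-247T12:00:00Z", "SNIF", "INFO",
                        "E1001", "Connection to ANCC established")
print(line)
# 2008-247T12:00:00Z | SNIF | INFO | E1001 | Connection to ANCC established
```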

  33. Acceptance Test Results - Server/System (4 of 8)
  • End-to-end System Testing
    • EPS – Client – Server – ANCC/HMD DAS
    • Interfaces verified during the Performance Test
    • Part of Candidate 5 regression testing
    • Results: partial success
      • Limited EPS-to-Client testing – EPS simulation is not a function of the ANCC (JSC verified in the Beta environment)
      • Limited time available for SNAS/HMD DAS testing – no regression testing for Candidate 5

  34. Acceptance Test Results - Server/System (5 of 8)
  • Performance/Loading Test
    • Test overview
      • Tested only SNAS/ANCC – HMD DAS not available
      • Ten testers – AT testers, TO&A, GCP NOM, SNAS developers
      • Each logged into 5 MOC Client sessions – 50 concurrent Client sessions
      • Two O&M Client sessions
      • Test duration – about 3 hours
    • Functions/activities performed
      • Import bulk SARs
      • Import bulk IIRVs
      • Import/transmit TSWs
      • Import TCW/generate TSWs
      • Import PSAT/UAV
      • Perform recurrent scheduling
      • Utilize EPS node operations
      • Perform GCMRs
      • Request reports and database queries

  35. Acceptance Test Results - Server/System (6 of 8)
  • Performance/Loading Test (cont’d)
    • Data collected
      • Server performance logs
        • Eldorado – OPS Data Server – (241/1720-241/2350)
        • Roadrunner – OPS Open Server – (241/0000-242/0000)
        • Fury – OPS Closed Server – (241/0000-242/0000)
      • Client alert logs / testers’ notes
    • Results: Passed
      • No node/resource failures noted
      • Over 900 requests/interactions with the ANCC at one time (~200 in normal ops)
      • Some delay noted at the Client for window refresh (e.g., Active Summary)
      • Alert logs indicate no latency in server processing
        • Interactions between SvE and SNIF < 2 seconds
        • Interactions between SvE and DSDM < 5 seconds
      • Memory in use pre-test was ~30%; 50 simultaneous Client sessions added ~2%. Extrapolating, 250 simultaneous Client sessions would add ~10%, leaving available memory > 60%.
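The memory projection above is a simple linear extrapolation. The sketch below reproduces the arithmetic; the ~30% baseline and ~2% cost per 50 sessions are the measured inputs from the slide, and linear scaling with session count is an assumption.

```python
# Linear extrapolation of the loading-test memory figures.
baseline_used = 30.0           # % memory in use before the test (~30%)
delta_50 = 2.0                 # % added by 50 concurrent Client sessions

per_session = delta_50 / 50    # ~0.04% per session, assuming linear scaling
delta_250 = per_session * 250  # projected cost of 250 concurrent sessions
available = 100 - (baseline_used + delta_250)

print(f"250 sessions add ~{delta_250:.0f}%, leaving ~{available:.0f}% available")
# 250 sessions add ~10%, leaving ~60% available
```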

  36. Acceptance Test Results - Server/System (7 of 8)
  • Performance/Loading Test (cont’d)
    • Server performance log data

  37. Acceptance Test Results - Server/System (8 of 8)
  • Server/System Summary
    • Remaining open IDRs – all related to HA
      • IDR #26487 – unexpected resources moving between nodes
      • IDR #26489 – explore a configuration modification to return a node to the standby state rather than its last state
      • IDR #26488 – failed counts not incrementing
        • Possibly a documented Linux-HA bug

  38. Acceptance Test Results
  • MOC Client Functionality
  • O&M Client Functionality
  • Server/System Testing
  • IT Security Controls

  39. Acceptance Test Results - Security Controls
  • 100 security controls recommended:
    • 66 – implementation acceptable
    • 4 – certification and accreditation
    • 7 – not applicable to SNAS
    • 9 – documentation needed
    • 13 – further verification required
    • 1 – waiver request planned

  40. Acceptance Test Results - Summary
  • 66/67 test cases executed
    • 1 blocked – bulk schedules to DAS
    • 54 passes
    • 12 partial successes
  • IDRs remaining open – 13
    • 3 – related to SNAS/NCCDS (TSW, RR)
    • 6 – related to SNAS/DAS
    • 1 – general Client operations – dialog box not closing
    • 3 – related to HA
  • IDRs closed – 34
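The tallies on this summary slide are internally consistent, which a quick check confirms (figures taken directly from the slide):

```python
# Consistency check of the acceptance-test summary tallies.
passes, partials, blocked = 54, 12, 1
print(passes + partials)            # 66 test cases executed
print(passes + partials + blocked)  # 67 total test cases

idrs_open = {"SNAS/NCCDS": 3, "SNAS/DAS": 6, "general Client": 1, "HA": 3}
print(sum(idrs_open.values()))      # 13 IDRs remaining open
```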

  41. WSC IT Security Evaluation Rich Pearce

  42. WSC IT Security Evaluation (1 of 6)
  • IT Security acceptance testing centered on the availability of approved documentation, the draft IT Security Plan, and the implementation of system-specific security controls (WSC-wide and NASA common controls were not considered for testing)
  • Required NASA-approved documentation
    • Final Risk Assessment, IT Security Plan, Contingency Plan
  • Account management
    • Server accounts
    • Client accounts
  • System/audit logging and review
  • Disaster recovery
    • Software
    • Oracle database

  43. WSC IT Security Evaluation (2 of 6)
  • Approved IT Security Documentation
    • Still incomplete
      • Draft documents only
      • Final Risk Assessment needed based on the actual implementation
      • Recommended changes to the IT Security Plan – still awaited
      • Contingency Plan can piggyback on Appendix C of WSC-PLN-0088 (WSC MEI IT Security Plan) – need for a separate plan is up to NASA
      • No final security verification matrix or test plan
  • WSC IT Security assessment – partially complete

  44. WSC IT Security Evaluation (3 of 6)
  • Account Management
    • Server account management
      • Server accounts created and managed based on an approved account request (Form WSC-0325)
      • WSC SA responsible for creating and configuring accounts based on the approved form
      • Server accounts will be restricted to System Admins (SAs), Database Admins (DBAs), ADPE Techs, and IT Security
    • MOC and O&M Client account management
      • No provision for account requests; the MOC User Guide requires a list of users and the data required
      • A completed/signed SNAS Rules of Behavior will be required of all users
      • Local O&M Client accounts (for WSC O&M personnel only) will follow the MOC requirements
      • The OPS DBA is responsible for MOC and O&M Client account management through O&M tools and procedures
  • WSC IT Security assessment – PASS
    • Local procedure needed for server account management, or change to the existing LOP as necessary

  45. WSC IT Security Evaluation (4 of 6)
  • System/Audit Logging and Review
    • Audit logging still not configured for security requirements
      • Server application log contains audit data (violates separation-of-duties requirements)
      • Designated “audit” log is empty
      • No tools for syslog or audit data review (dependence on system messages)
      • Unable to test alerting for audit process failure (could not stop the audit process)
  • WSC IT Security assessment – Fail
    • Need to finalize syslog and audit data locations and ensure only IT Security/root-level access
    • Need tools for audit data extraction and syslog/audit log backup to removable media

  46. WSC IT Security Evaluation (5 of 6)
  • Disaster Recovery
    • No procedure/plan for system/database backups, including off-site storage
    • SNAS software disaster recovery procedure
      • WSC LOP Volume IV, Book 2, LOP #32, “Software Configuration Management Disaster Control,” provides for SW recovery
      • Five software patches (Client/Server/DB) were delivered during AT – GSFC remote access for software delivery exercised
    • Oracle database recovery
      • EIF instance – auto backup is functioning
        • Open IDR #26368 – repeat for OPS instance
      • EIF shutdown and database recovery successfully exercised
        • Open IDR #26433 – repeat for OPS instance
        • Procedure to be documented
        • Procedure exercised on OPS instance
  • WSC IT Security assessment – partially complete
    • Document procedures for database and system backup – close IDRs #26368 and #26433

  47. WSC IT Security Evaluation (6 of 6)
  • IT Security Summary/Action Plan
    • Much remains for the developer and WSC
      • Correction of the issues presented – especially regarding auditing
      • Create tools/procedures to assist in System Admin/IT Security work
      • Finalize required documents
      • Final system check, including an updated full-system vulnerability scan by WSC
      • Controls in the Security Plan that cannot be met or risk-accepted need a POA&M for correction
  • Overall readiness assessment – Not Ready
    • Primarily due to audit issues
    • Need for finalized documentation
    • Need for system/database backup procedures

  48. SNAS Training David Warren

  49. Customer MOC Client
  • Held Customer Interface Meeting #8 on July 30th, with training on
    • Installation of Client and certificates
    • Client property configuration
    • File system setup for the External Processing System
    • Mission Manager responsibilities
      • Spacecraft characteristics and default scheduling parameters
      • Mission account setup and role assignments
    • Generating NCC state vector and schedule requests
    • Overview of reporting and the graphical timeline
  • Mission representatives in attendance
    • HSF, HST, SPTR-2, NPOESS, SWIFT, IBEX, GLORY, GALEX, GPM, TRMM, and WSC and JSC engineering
    • Several Network Operations Managers supporting multiple missions
  • Next session, in September, to cover DAS scheduling, graphical timeline usage, and the External Processing System
  • Tech Ops received MOC Client training at GCP
  • JSC schedulers & HSF NOMs participated in WSC training

  50. WSC O&M
  • Held session on August 26th, with training on
    • Server architecture and components
    • Server configuration and operation
    • High Availability
    • O&M Client
  • 2nd session, week of 9/15/2008, to cover
    • Database design and management
    • Digital certificate management
    • System administration procedures
    • Problem reporting and tracking
    • Troubleshooting procedures
  • WSC representatives in attendance
    • Sys Admin, Oracle DB Admin, O&M DB Admin, TO&A, Ops Trainer
  • WSC Oracle DBA received Oracle Backup/Recovery training
