
NSAP Technology Refresh NTR Transition Readiness Review TRR and Operational Readiness Review ORR 3-23-06


Presentation Transcript


    1. 3/9/2012 NSAP Technology Refresh (NTR) Transition Readiness Review (TRR) and Operational Readiness Review (ORR), 3-23-06. Vicki Stewart, NTR Project Manager; Scott Douglas, NTR Project Engineer.

    2. Agenda
       - Introduction and Project Overview (Page 2): V. Stewart
       - Testing (Page 15): R. Sliko
       - Implementation & Transition Approach (Page 25): M. Mascari
       - Transition Risks and Mitigations (Page 34): V. Stewart
       - Network Security (Page 44): V. Stewart
       - Network Operations, Maintenance and Sustaining Engineering (Page 47): V. Stewart
       - Schedule (Page 61): V. Stewart
       - Conclusion and Recommendation (Page 69): V. Stewart

    3. Introduction and Project Overview. V. Stewart

    4. Introduction: TRR/ORR Purpose
       - Demonstrate NISN is ready to proceed with the NTR transition.
       - Demonstrate Ops personnel are trained and prepared to support operations.
       - Identify activity milestones.
       - Obtain senior-level management buy-in to move forward with transition.

    5. NTR Project Overview
       Goal: to implement a reliable, cost-effective network capable of meeting NASA mission requirements today as well as in the future.
       Objectives:
       - Improve network reliability.
       - Replace non-maintainable systems.
       - Optimize wide area network connectivity.
       - Complete installation and transition by December 31, 2006.
       - Provide mission services that meet the performance parameters specified in the NISN Services Document.
       - Minimize disruption of service to NASA projects and missions during transition to the new network.
       - Optimize cost effectiveness by aggregating links into bigger pipes.

    6. NTR Project Overview (Cont.)

    7. NTR Project Overview (Cont.)

    8. NTR Project Overview (Cont.)

    9. NTR Project Overview (Cont.)

    10. NTR Project Overview (Cont.): NTR Backbone
       - Private MPLS network utilizing RAD Megaplex-2100s, RAD IPmux units, and Juniper routers.
       - Core backbone consisting of OC-3s, DS-3s, ML-PPP T1s, and T1s; capable of OC-48.
       - Backbone will run OSPF as the interior routing protocol.
       - MPLS fast reroute around failures (< 50 to 100 ms).
       - NASA traffic will be mapped directly into a Layer 2 MPLS label-switched path and switched across the NTR backbone, eliminating the need to exchange IP information.
       - Small sites will be regionally aggregated into backbone sites, reducing backhaul costs and network complexity.
       - Supports Quality of Service, including the ability to police and shape traffic.
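The "police and shape traffic" capability mentioned on the slide above is conventionally built on a token bucket. The following is a minimal, generic sketch of that mechanism for illustration only; it is not the actual AT&T/Juniper NTR configuration, and the rate and burst values are arbitrary.

```python
# Minimal token-bucket policer sketch. Generic textbook mechanism,
# not the NTR backbone's real QoS configuration; values are arbitrary.

class TokenBucket:
    def __init__(self, rate_bps: float, burst_bits: float):
        self.rate = rate_bps      # token refill rate (bits per second)
        self.burst = burst_bits   # bucket depth (max accumulated credit)
        self.tokens = burst_bits  # bucket starts full
        self.last = 0.0           # timestamp of the previous decision

    def allow(self, packet_bits: int, now: float) -> bool:
        """Admit the packet if enough tokens have accumulated; else police it."""
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bits <= self.tokens:
            self.tokens -= packet_bits
            return True
        return False

tb = TokenBucket(rate_bps=1_000_000, burst_bits=12_000)  # 1 Mbps, ~1500-byte burst
print(tb.allow(12_000, now=0.0))    # True: bucket starts full
print(tb.allow(12_000, now=0.001))  # False: only ~1000 bits refilled in 1 ms
```

A shaper differs only in queuing the excess packet until tokens accumulate instead of dropping it.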

    11. NTR Project Overview (Cont.): NTR Backbone (Cont.)
       - All satellite, DACS, and T1 circuits will be terminated on the MP-2100 with a MainLink 2T1 card or a RAD IPmux. The only satellite T1 links to remain will be JSC-DFRC and GSFC-WLPS.
       - Voice and low- and high-speed serial data will be terminated on the MP-2100 with the appropriate line card.
       - Ethernet interfaces will be terminated on the Juniper 10/100 Ethernet card.
       - Supports mapping of PPP, ML-PPP, Frame Relay, ATM, and HDLC into a Layer 2 label-switched path.

    12. NTR Project Overview (Cont.): Site Types
       - Large sites: two MPLS routers & channel banks
       - Medium sites: one MPLS router & one or more channel banks
       - Small sites: one or more channel banks
       - T1-interface sites: T1 CSU/DSU

    13. NTR Project Overview (Cont.): Milestones
       - AT&T Critical Design Review: Jan 05
       - NTR Project Management Plan baseline: Feb 05
       - AT&T and Juniper proof-of-concept demonstration: Apr 05
       - NTR Training Workshop: Aug 05
       - NTR Customer Design Document (CDD) signed: Oct 05
       - AT&T installations (GSFC and SUIT): Oct 05
       - NTR Test Plan baseline: Nov 05
       - NTR Transition Plan baseline: Nov 05
       - AT&T installations (JSC and PSU): Nov 05
       - AT&T installations (WSC, JPL, ASU, GDS, NHQ, & DFRC): Dec 05
       - First NTR pilot transition (ASU): Dec 05

    14. NTR Project Overview (Cont.): Milestones (Cont.)
       - AT&T submitted security checklist for the AT&T portal: Jan 06
       - AT&T installations (STAN, CAP, and BASP): Jan 06
       - Second NTR pilot transition (CAP): Jan 06
       - Successful data flow from BASP to JPL via WSC: Jan 06
       - Third NTR pilot transition (PSU): Feb 06
       - AT&T installations (LITT, U of CO, STAN, SFAL, SBRY, GDS): Feb 06
       - AT&T installations (LARC, BERK, LANM, VAFB): Mar 06
       - Fourth NTR pilot transition (SBRY): Mar 06

    15. Testing. R. Sliko

    16. Testing: Five Configurations Were Evaluated
       - AT&T/Juniper proof of concept: RAD equipment, Juniper routers, and Nortel routers tested in the Juniper lab and GSFC IPNOC lab facilities.
       - Standalone RAD timing: RAD equipment contained within Room S181 at GSFC with the network side looped.
       - Pilot 1 (T1): RAD equipment at GSFC, MSFC, JSC, and KSC connected via T1 circuits.
       - Pilot 2 (internal MPLS): RAD equipment and Juniper routers in a test network contained within Room S181 at GSFC.
       - Operational network: RAD equipment and Juniper routers on the actual operational NTR MPLS IP network.

    17. Testing (Cont.): AT&T and Juniper Proof of Concept
       Purpose:
       - Demonstrate the ability to pass voice and data traffic over an IP MPLS backbone.
       - Demonstrate that the designed architecture can provide a Layer 2 circuit so two NASA Nortel routers can communicate directly without sharing IP information with the AT&T backbone.
       - Test the functionality of the equipment and verify hardware specifications.
       - Test failure scenarios of the equipment and the resiliency of the network when failures occur.
       Results:
       - All issues and concerns have been resolved except the need to implement Quality of Service (QoS). QoS is addressed in the Risks section of this presentation.

    18. Testing (Cont.): Standalone Testing
       Purpose:
       - Evaluate RAD internal and external timing options.
       Results:
       - The RAD channel bank can support external timing. The entire channel bank is timed from the designated DTE interface; all other channels on the RAD must conform to this timing.
       - RAD (DTE mode) output timing (TT) is not locked to RAD output data (SD). This requires site equipment to either ignore the RAD output timing or use the site input timing when the RAD is configured as a DTE.

    19. Testing (Cont.): Pilot 01 (T1) Data Tests
       Purpose:
       - Demonstrate the ability to interface RAD equipment to site legacy equipment and pass data traffic over a T1 circuit between GSFC/JSC, GSFC/MSFC, and GSFC/KSC.
       Results:
       - GSFC/JSC: throughput commands at 224 kbps; low-speed 2.4 kbps and 9.6 kbps BERT.
       - GSFC/MSFC: SCD (internal BERT) to ASN router to RAD; low-speed 2.4 kbps and 9.6 kbps; data over voice (Freddy 600 to 201-C modem to RAD); voice.
       - GSFC/KSC: Fireberd to KSC TMS to KSC RAD (DTE) to GSFC RAD (DCE) to Fireberd; used existing NSAP resources (satellite T1 between GSFC and KSC).

    20. Testing (Cont.): Pilot 01 (T1) Voice Tests
       Purpose:
       - Interface RAD equipment to site legacy voice equipment and pass voice traffic over a T1 circuit between GSFC/JSC and GSFC/MSFC.
       - Sage Mean Opinion Score (MOS) test, round-trip delay test, and Modified Rhyme Test.
       - Shuttle Training Aircraft (STA) test: piggybacked on a scheduled Shuttle Training Aircraft activity to flow voice from KSC to JSC, with JSC Ops evaluating.
       Results:
       - The 8 kbps MOS score (> 3.5) compares with the existing NSAP 24 kbps service.
       - The 8 kbps noise level is less than the NSAP 24 kbps or 32 kbps service.
       - The 8 kbps bandwidth is 5.7 to 12.8% less than the NSAP 32 kbps or 24 kbps. RAD 8 kbps bandwidth (85.2-87.1%) exceeds the 77% theoretical bandwidth of a G.729A codec.
       - The worst-case 8 kbps round-trip delay of 206.8 ms (GSFC/MSFC) is longer than the GSFC/MSFC NSAP 32 kbps round-trip delay of 56.8 ms, but does not exceed the ITU-T G.114 recommended maximum round-trip delay of 300 ms.
       - Modified Rhyme Tests on RAD 8 kbps scored from 88% to 90%; results above 85% exceed the acceptable threshold.
       - 8 kbps STA voice test results (over satellite T1): acceptable.

    21. Testing (Cont.): Pilot 2 (Internal MPLS) Test
       Purpose:
       - Evaluate the RAD equipment's ability to pass data traffic over an IP MPLS backbone.
       Results:
       - No errors over the 15-minute duration of tests at 56 kbps, 64 kbps, 512 kbps, 1024 kbps, 1472 kbps, 1536 kbps, and 1920 kbps.
       - Round-trip delay measured from 11.6 ms to 47.5 ms (rate dependent).

    22. Testing (Cont.): Operational MPLS Network Tests
       Purpose:
       - Evaluate the RAD equipment's ability to pass data and voice traffic over the operational MPLS backbone.
       - Evaluate the operational MPLS backbone's ability to pass IP data traffic.
       Results:
       - GSFC/JPL RAD data tests: 256 kbps test ran overnight with no errors, 73.5 ms round-trip delay; 56 kbps test ran overnight with no errors, 73.8 ms round-trip delay; 9.6 kbps quick-look test, no errors, 104.3 ms round-trip delay.
       - GSFC/JPL voice tests: 8 kbps, Sage MOS score greater than 3.63, round-trip delay ~246 ms; tone test, less than 0.8 dBm loss.
       - GSFC/WSC voice tests: 64 kbps, Sage MOS score above 4.25, round-trip delay ~90 ms; 8 kbps, Sage MOS score above 3.68, round-trip delay ~233 ms; 8 kbps tone test, less than 0.6 dBm loss.

    23. Testing (Cont.): Operational MPLS Network Tests (Cont.)
       Results (Cont.):
       - GSFC/WSC IP tests: quick-look test initially passed 60 Mbps; the maximum frame size tested was 4000 bytes.
       - During the test, the 60 Mbps primary path over the dual DS-3s to WSC failed over to the backup path. The backup path is planned to be on the OC-3 to JSC, but that path does not yet exist, so the 60 Mbps flow was rerouted over the single DS-3 from GSFC to JPL and then to WSC. This dropped the throughput to 40 Mbps and also caused the 56 kbps and 256 kbps GSFC-to-JPL tests to drop.
       - AT&T has configured the Quality of Service of the 60 Mbps data flow so that it does not attempt to reroute over the GSFC-to-JPL DS-3. A failover test is planned for the week of March 20.
       - 60 Mbps test 2 (EOS): frame size 4380 bytes, ran almost 17 hours, lost 1 frame, round-trip time 53 ms.
       - 9.1 Mbps (IONet): frame size 1518 bytes, round-trip time 52 ms, ran 239 hours, no frames lost.
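The 239-hour IONet result above is stronger than "no frames lost" sounds. A back-of-the-envelope sketch (my own arithmetic, not from the review, and assuming the link was fully loaded at 9.1 Mbps with 1518-byte frames and no inter-frame overhead) shows roughly how many frames the test carried and the loss-rate bound a zero-loss run implies:

```python
# Rough arithmetic for the 239-hour, 9.1 Mbps IONet test quoted above.
# Assumes full load with 1518-byte frames and no inter-frame overhead.

RATE_BPS = 9.1e6      # offered load (bits per second)
FRAME_BYTES = 1518    # frame size used in the test
HOURS = 239           # test duration

frames_per_sec = RATE_BPS / (FRAME_BYTES * 8)
total_frames = frames_per_sec * HOURS * 3600   # roughly 6.4e8 frames

# With zero losses observed, the loss rate is bounded above by 1/total_frames.
loss_bound = 1 / total_frames
print(f"~{total_frames:.2e} frames carried, loss rate < {loss_bound:.1e}")
```

That bound (on the order of 1e-9) sits well inside the < 0.001% packet-loss acceptance criterion used elsewhere in this review.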

    24. Testing (Cont.): Questions?

    25. Implementation & Transition Approach. M. Mascari

    26. Implementation & Transition Approach: General
       - In most cases, NTR will be implemented in parallel with the existing Mission Critical Network to allow for manual fail-back in the event that interruptions to NTR services become operationally unacceptable during the transition period.
       - The Transition Plan identifies every service/channel to be transitioned and its fail-back capability. The NTR project will coordinate with remote end sites for manual fail-back. Small sites (ASU, CAP, PFLT, PSU, APL, LANM) may have limited ability to fail back.
       - To the extent possible, the existing channel-level service identifiers will be retained at the sites' request to minimize changes to existing site documentation and labeling.

    27. Implementation & Transition Approach (Cont.)
       Events:
       - Carrier circuit and equipment installation
       - Quality assurance tests
       - Node acceptance tests
       - Site concurrence to proceed with transition
       - Transition and service acceptance tests
       Quality assurance testing:
       - Validates the integrity of the carrier-installed facility.
       - Conducted by GSFC Tech Control or a designated site.
       - Must be conducted within 72 hours after the carrier notifies Tech Control that the circuit is available for testing.
       - Consists of one 24-hour BERT that must meet a BER < 1E-07 with 0 frame slips.
       - Hot-cut sites have a 1-hour test or a service acceptance test only.

    28. Implementation & Transition Approach (Cont.): Node Acceptance Testing
       - Validates the ability to pass data and/or voice via the NTR equipment and circuits.
       - For each site, test suite components have been identified based on the services in use at that site.
       - Test suite components: Ethernet service test, serial data test, data-over-voice test, full T1 data test, analog voice services test, and T1 voice services test.
       Criteria:
       - Serial data services (BERT): BER < 1E-07 and 0 frame slips over 24 hours.
       - Voice services (Sage 935AT): S-MOS score > 3.5 and delay < 300 ms.
       - Ethernet services (Sunset test set): packet loss < 0.001%, BER < 1E-07, and round-trip time (Juniper Ethernet port to Juniper Ethernet port) < 100 ms.
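The per-service criteria above amount to a small decision table. A hypothetical sketch of that table as checkable code — field names are my own invention, thresholds are the ones quoted on the slide:

```python
# Node-acceptance criteria from the slide, expressed as a lookup table.
# The measurement field names (ber, smos, etc.) are illustrative, not
# taken from any actual NTR test tooling.

CRITERIA = {
    "serial":   lambda m: m["ber"] < 1e-7 and m["frame_slips"] == 0,
    "voice":    lambda m: m["smos"] > 3.5 and m["delay_ms"] < 300,
    "ethernet": lambda m: (m["packet_loss_pct"] < 0.001
                           and m["ber"] < 1e-7
                           and m["rtt_ms"] < 100),
}

def accept(service_type: str, measurements: dict) -> bool:
    """Return True when the measurements meet the slide's criteria."""
    return CRITERIA[service_type](measurements)

# Example: the GSFC/WSC 64 kbps voice result (MOS above 4.25, ~90 ms RTT)
# from the testing section clears the voice criteria.
print(accept("voice", {"smos": 4.25, "delay_ms": 90}))  # True
```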

    29. Implementation & Transition Approach (Cont.)
       Transition process:
       - NNSG will issue NTR Service Advisory Messages (NSAMs) for all service transitions.
       - NSAM messages will identify each service identifier, site connection information, and transition date.
       - NSAM messages will be issued at least one week in advance of the target transition date.
       Service acceptance testing:
       - Verifies the configuration of each carrier-installed and configured item prior to flowing realtime data across the NTR equipment and circuits.
       - Test suite components are based on service type; testing is conducted for each service prior to connecting to user equipment.
       - The NISN Operations Manager and the end user's Project Manager sign the NASCOM Mission Operations Acceptance Form - NTR Transition, indicating the service is accepted.

    30. Implementation & Transition Approach (Cont.)

    31. Implementation & Transition Approach (Cont.)
       Voice transition:
       - Approximately 24 voice services scheduled per day at a given site (KSC: 12/day due to site limitations).
       - Verification: voice services without signaling receive a tone test, S-MOS, and voice check; voice services with signaling also receive a signaling test.
       Serial data and data-over-voice transition:
       - Approximately 2 services scheduled per day at a given site.
       - Verification: bit error rate check (loopback). A 15-minute serial data test duration allows for a 99% confidence level at all rates 56 kbps and above, allowing for up to 2 errors. Round-trip time will be recorded for all serial data tests as a baseline. End-to-end dataflow.
       Ethernet transition:
       - Approximately 1 Ethernet service scheduled per day.
       - Verification: ping tests, data flow, and monitoring.
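As a sanity check on the 15-minute serial test duration (my own arithmetic, not the project's confidence calculation): even at the slowest qualifying rate, the test pushes enough bits that the allowed 2 errors still correspond to an observed BER well under the 1E-07 criterion.

```python
# Back-of-the-envelope check of the 15-minute serial test at the slowest
# qualifying rate, 56 kbps. This is illustrative arithmetic only.

RATE_BPS = 56_000
DURATION_S = 15 * 60
MAX_ERRORS = 2  # errors allowed by the verification process above

bits_tested = RATE_BPS * DURATION_S            # 50,400,000 bits
worst_observed_ber = MAX_ERRORS / bits_tested  # ~4.0e-8, under the 1E-07 limit
print(f"{bits_tested:.2e} bits, observed BER <= {worst_observed_ber:.1e}")
```

Higher rates test proportionally more bits in the same 15 minutes, so the margin only improves.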

    32. Implementation & Transition Approach (Cont.): Post-Transition Activities
       - A circuit disconnect order may be issued 7 days following the successful transition of all services from a given NSAP circuit (T1) and will utilize the NSR process.
       - A limited number of circuits will remain in place for an operations confidence period; critical Shuttle services may exceed 30 days of parallel operations.

    33. Implementation & Transition Approach (Cont.): Questions?

    34. Transition Risks and Mitigations. V. Stewart

    35. Transition Risks and Mitigations: Risk #1 - Site Not Ready for NTR Installation
       Description: major sites are not ready to support NTR installation.
       - JSC: originally scheduled for installation in November. Insufficient SBC facilities into the center, requiring a build. NTR equipment was installed on site during the installation window. The expected completion date for the build through the Apollo Central Office is 4/21/06; the expected completion date for the build through the Seabrook Central Office has slipped again and is now set for 6/27/06.
       - MSFC: originally scheduled for November. At the request of the site, the NTR demarc was moved from Bldg. 4207 to Bldg. 4663. In order for BellSouth to provide diversity into Bldg. 4663, an AccuRing had to be built. After the NTR CDD was signed, AT&T entered into a contract with BellSouth to start the build. Expected completion date: late June.
       Mitigation: work closely with the sites to resolve issues and schedule installation and transition activities.

    36. Transition Risks and Mitigations (Cont.): Risk #2 - Site Not Able to Support Transition as Scheduled
       Description: sites are not ready to support installation because of contractual issues.
       - KSC: originally scheduled for December. KSC contractor InDyne required money to be put on their contract. The contractual issue was resolved in January and work began. Due to the delay in submitting work orders, power will not be available for the NTR equipment until late June.
       - Poker Flat: the facility is a commercial facility and the contractor requires funding to support NTR. Spoke with Roger Clayson/GN on 3/20/06 and requested funding for 120/hr to support installation and hot cut of NTR services at Poker Flat.
       - Schriever AFB: with the closure of OAFB, Shuttle support will be moved to Schriever AFB. OAFB NTR orders have been canceled. SAFB reports that it cannot support NTR without funding. GSFC/Code 450 is working this issue directly with the Air Force.
       Mitigation: work closely with the sites to resolve issues and schedule installation and transition activities.

    37. Transition Risks and Mitigations (Cont.): Risk #3 - Project Cost
       Description: total project cost is coming in higher than the forecast cost; sites have requested parallel service for an extended period.
       Mitigations: analyze bandwidth to see if anything can be turned down; reduce core cost by moving cost to the user.

    38. Transition Risks and Mitigations (Cont.): Risk #4 - Not Implementing Quality of Service
       Description: lower-priority traffic, such as in-band JUNOScope management flows, causes errors on network data flows.
       Mitigations: Quality of Service is to be implemented. AT&T and NISN met on 3/16/06 to identify traffic types and their appropriate QoS prioritization levels. AT&T will implement QoS on the GSFC-Suitland router interfaces to validate the QoS plan prior to full implementation. JUNOScope traffic will be reclassified to the lowest QoS priority.
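The intent of the mitigation above — management traffic must never crowd out mission flows — is the classic strict-priority queueing discipline. A generic sketch of the idea; class numbers and packet names are illustrative, not the actual AT&T QoS plan:

```python
# Strict-priority queueing sketch: the highest-priority non-empty queue is
# always served first, so lowest-class traffic (e.g. JUNOScope-style in-band
# management polls) only drains when mission queues are empty.
# Class assignments here are invented for illustration.

from collections import deque
from typing import Optional

QUEUES = {0: deque(), 1: deque(), 2: deque()}  # 0 = highest priority

def enqueue(priority: int, packet: str) -> None:
    QUEUES[priority].append(packet)

def dequeue() -> Optional[str]:
    """Serve the highest-priority non-empty queue; None if all are empty."""
    for prio in sorted(QUEUES):
        if QUEUES[prio]:
            return QUEUES[prio].popleft()
    return None

enqueue(2, "junoscope-poll")  # management, lowest class
enqueue(0, "mission-data")    # realtime mission flow
print(dequeue())  # mission-data
```

A production scheduler would normally add a policer on the low class as well, so a misbehaving high class cannot starve management traffic indefinitely.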

    39. Transition Risks and Mitigations (Cont.): Risk #5 - Failover
       Description: automatic failovers are significantly more complex than in the NSAP network.
       Mitigations: AT&T engineering analysis, modeling, and lab testing; NISN test bed.

    40. Transition Risks and Mitigations (Cont.): Risk #6 - Diversity and Special Routing
       Description: some diversity and special routing requirements as implemented in NSAP were not clear.
       Mitigations: develop a detailed requirements document describing service-level requirements; review the AT&T implementation to ensure these requirements are met.

    41. Transition Risks and Mitigations (Cont.): Risk #7 - NOMC Network Monitoring with RADview
       Description: RADview takes a long time to update screen information.
       Mitigation: AT&T recommends installation of the RADview PC client. Target completion date: week of 4/17/06.

    42. Transition Risks and Mitigations (Cont.)

    43. Transition Risks and Mitigations (Cont.): Questions?

    44. Network Security. V. Stewart

    45. Network Security
       - Completed the IONet security checklist.
       - NISN security meeting with AT&T held on 3/16/06.
       - NTR will be included in the Mission Network Security Plan.
       - AT&T is providing its AGSEMC Security Plan and high-level security policies for PoP facilities.
       - FISMA requirements for NTR are under discussion.

    46. Network Security (Cont.): Questions?

    47. Network Operations, Maintenance and Sustaining Engineering. V. Stewart

    48. Network Operations Overview
       - AT&T is responsible for management, operations, and fault isolation to the carrier Service Delivery Point (SDP) of the network, from the AT&T Government Solutions Enterprise Management Center (AGSEMC) in Oakton, VA, with a back-up facility at AT&T Labs in Middletown, NJ. This includes network node configuration, service configuration, and service re-routing in order to manage end-to-end transmission paths down to the channel level.
       - NISN organizations responsible for network operations: COMMGR, Tech Control, and IPNOC.
       - NISN organizations responsible for network configuration management: NNSG and Network Requirements & Analysis (NR&A).

    49. Network Operations (Cont.)
       - External customer reporting and coordination processes remain the same.
       - The NISN Communications Manager (COMMGR) remains the central point of contact for problem reporting, coordination, and status for real-time issues.
       - The NISN Network Scheduling Group (NNSG) provides scheduling and customer notification for planned activities and outages.
       - Network Requirements & Analysis (NR&A) will track service metrics and discrepancies for Mission Communications Services on NTR.
       - Trouble reports and day-to-day operations will comply with the NASCOP.
       - Changes, if any, to existing capabilities are noted where appropriate.

    50. Network Operations (Cont.): COMMGR
       - COMMGR will use existing processes to support NTR: the Mission Outage Notification System (MONS), the Problem Management and Dispatch System (PMDS), and trouble tickets.
       - The same service identifiers are used for NSAP and NTR; COMMGR will confirm whether a problem report relates to the NSAP service or the new NTR service during transition.
       - COMMGR has been provided NTR material to review.
       - COMMGR access to the AT&T NTR portal will support trouble ticketing. The COMMGR portal is scheduled for installation; COMMGR will rely on Tech Control until access and training are completed. No operational impact is expected.

    51. Network Operations (Cont.): Tech Control
       - Technical Control personnel have trained on both the RADview and JUNOScope management systems used to monitor the NTR network through the portal provided by AT&T, and are prepared to support operations.
       - The scope of visibility and configuration control into the NTR network is less than the current level within the NSAP network.
       - Technical Control personnel are training on the NTR test equipment provided.
       - Tech Control will interface with AT&T and the sites to troubleshoot NTR circuit and equipment issues, and, in conjunction with AT&T, will provide diagnostic test support as well as instruction on the use of the NTR tools provided to the sites.
       - Technical Control personnel will use the circuit directory developed by the NNSG to cross-reference site trouble reports (using the existing channel-level service identifiers) to the actual NTR circuit/hardware interfaces for fault isolation.
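The cross-reference step described above is essentially a directory lookup. A hypothetical illustration — every identifier below is an invented placeholder, not a real NNSG directory entry:

```python
# Hypothetical sketch of the NNSG circuit-directory lookup: a site reports
# trouble using its existing channel-level service identifier, and Tech
# Control resolves that to the NTR circuit and hardware interface carrying
# the service. All names here are invented placeholders.

from typing import Optional, Tuple

CIRCUIT_DIRECTORY = {
    # service identifier -> (NTR circuit, equipment/port)
    "VG-EXAMPLE-001":   ("NTR-T1-EXAMPLE",  "MP-2100 slot 3, voice card ch 1"),
    "DATA-EXAMPLE-002": ("NTR-DS3-EXAMPLE", "Juniper 10/100 Ethernet port 0"),
}

def locate(service_id: str) -> Optional[Tuple[str, str]]:
    """Resolve a trouble-report service identifier to NTR circuit/hardware."""
    return CIRCUIT_DIRECTORY.get(service_id)

circuit, port = locate("VG-EXAMPLE-001")
print(circuit, "/", port)
```

Because the legacy service identifiers are retained through the transition (per the implementation approach earlier), sites never need to learn the NTR-side names to report a problem.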

    52. Network Operations (Cont.): IPNOC
       - IPNOC will use JUNOScope via the AT&T portal to collect statistics and NTR device information. Collected data will be used to enhance IPNOC management of the NISN mission routed data networks.
       - The scope of visibility into the NTR network is less than the current level within the NSAP network.
       - The portal will be integrated into the IPNOC upon EC approval; it is currently connected in the Tech Control area only. Training is in progress.

    53. Network Maintenance
       AT&T is responsible for:
       - Providing dedicated technicians and an 800 direct hot line.
       - Assisting in troubleshooting when required, via telephone or on site, through established escalation procedures.
       - Performing system-level maintenance actions and troubleshooting failures to the Line Replaceable Unit (LRU) level.
       - Operating a depot-level logistics facility for repair, replacement, and exchange of failed LRUs.
       - Providing a predetermined set of spares for each site to maintain the required service levels, and replacing failed units within a specified time period to replenish the local spares.

    54. Network Maintenance (Cont.): NTR Problem Escalation

    55. Network Maintenance (Cont.): Host Center Support
       - Replacing failed LRUs with a local spare at the direction of AT&T and GSFC Tech Control.
       - Returning failed LRUs to AT&T for repair/replacement.
       - Troubleshooting and diagnostic test support.
       NTR tools provided to sites: T1 patch panels, loopback plugs, and a Sunset Ethernet/T1 test set.
       GSFC Tech Control will interface with AT&T and the host centers to troubleshoot and repair the NTR equipment.

    56. Network Sustaining Engineering
       AT&T will provide sustaining engineering, which includes:
       - Network sizing and capacity management: bandwidth utilization analysis of the NTR backbone; time slot availability.
       - Routing optimization of services across the NTR backbone: interconnect optimization; circuit loading parameter analysis; NTR service diversity analysis.
       - Network planning oversight; tracking and providing historical network performance data.
       - Spare Line Replaceable Unit (LRU) management and oversight: analyze and review LRU levels for each site.
       - Hardware, firmware, and software upgrades/modifications: provide and install through the end of the contract.
       - New service requirements: responsible for design, implementation, testing, configuration analysis, and documentation of network services through the existing NSR process.
       - Network security management.

    57. Network Configuration Management
       AT&T is responsible for developing and maintaining the following documentation and databases:
       - Site surveys
       - Design, Cost and Schedule (DCS) packages
       - Implementation Plan
       - Technical system documentation (O&M manuals, user guides, etc.)
       - Training materials
       - Security documentation
       - As-installed documentation (rack elevations, cabling, etc.)
       - NTR equipment configurations
       - Site topology drawings

    58. Network Configuration Management (Cont.)
       NISN Network Scheduling Group (NNSG):
       - The Mission Network database has transitioned to the NISN Integrated Information System (NIIS) and is fully operational.
       - NIIS includes documentation for services (NSRs, SOs, PSAMs, FSAMs, AT&T configuration worksheets, etc.), site points of contact, and circuit drawings.
       Network Requirements & Analysis (NR&A):
       - Carrier service metrics will be tracked the same way as currently, from Tech Control tickets (what used to be a circuit will now be a service identifier).
       - As more experience is gained with the NTR tools and services, the metrics process may be revisited to better meet the technology and NISN reporting requirements.

    59. Network Configuration Management (Cont.)
       NISN is responsible for providing the following documentation:
       - Proof of Concept Test Plan
       - Acceptance Test Plan
       - Transition Plan
       - Updates to the NASCOP to include NTR information

    60. Network Operations, Maintenance and Sustaining Engineering (Cont.): Questions?

    61. Schedule. V. Stewart

    62. Schedule

    63. Schedule (Cont.)

    64. Schedule (Cont.)

    65. Schedule (Cont.)

    66. Schedule (Cont.)

    67. Schedule (Cont.)

    68. Schedule (Cont.): Questions?

    69. Conclusion and Recommendation. V. Stewart

    70. Conclusion and Recommendation
       Conclusion: NTR is ready to be transitioned as an operational network.
       Recommendation: NISN management approve the NTR project's proceeding with transition of services from the legacy network onto the new technology-refreshed network.
