The Collaborative Radar Acquisition Field Test (CRAFT): Next Steps

Kelvin K. Droegemeier
University of Oklahoma
2nd Level II Stakeholders Workshop
26-27 September 2002
Norman, Oklahoma


The Issues Before Us
  • Grant funding for CRAFT communication links and personnel is nearly exhausted (data will stop flowing from CAPS sometime in November)
  • The private and academic sectors are finding value in real time Level II data
  • A real time Level II NWS collection system
    • is likely more than 1 year away
    • may not provide the latencies and reliability needed by the private sector for the short term
    • may be perfectly suited for meeting all needs in the longer term
  • What options exist?
  • How can we maximize the benefits to all stakeholders: Government, industry, academia?
Options
  • A wide range of potential options exists, all of which require Government approval
    • Shut CRAFT down and wait for the NWS system
      • Timeline not yet defined
      • Not clear the NWS system will meet non-Government user needs
      • We likely won’t know until the system is in place
      • If it does meet all user needs, we’re set
      • If it does not, no alternative will exist (might take months to create)
    • Continue the present collaborative system (58 radars) or expand to all 120 NWS radars (lots of sub-options)
    • Create a stand-alone system that includes all 120 NWS WSR-88D radars, serves as a back-up to whatever the NWS implements, and has 7x24 support, improved reliability, etc.
  • Must consider administration of system (later in talk)
  • The ideal perhaps is a partnership among all groups, with “partnership” defined many ways
Suppose the NWS Deploys and Manages its Own Level II Distribution System (a very sensible approach)
Logical Network Topology

[Diagram: at the moment, OU is the only LDM server – single points of failure (the server itself and the line from each radar).]
Logical Network Topology

[Diagram: a central LDM server feeding universities, NOAA Laboratories, NOAA Joint Institutes, NCAR/UCAR, MIT/Lincoln Lab, NWS Regional HQ, NCEP Centers, and RFCs.]
Logical Network Topology

[Diagram: the same LDM server and recipient sites – these connections already exist!]
Logical Network Topology

[Diagram: radars feed the LDM server via phone lines or commodity Internet; distribution rides the Abilene backbone, which carries no commercial traffic.]
[Diagram sequence (slides 12-22): the single-server topology grows into a set of LDM hub servers interconnected over the Abilene network. Each LDM "hub site" carries all 88D data on the Abilene "bus", providing redundancy. Radars and end users connect to the nearest hub via commodity Internet, and a private company can redistribute the feed to its customers over dedicated or commodity links.]
Features of this Concept
  • NOAA runs its own operational ingest system but allows connections to the BDDS of each NWS radar
  • The CRAFT configuration
    • Is completely scalable to more nodes or radars
    • Is highly redundant (each major hub server contains all of the data)
    • Is highly reliable (loss of a major hub has minimal impact)
    • Leverages existing infrastructure
    • Links easily to other networks (e.g., AWIPS)
    • Has significant capacity for future growth (dual-pol, phased array)
    • Could have dual communication lines from each radar
    • Could serve as a backup system for the NWS
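The hierarchical feed described above is the same pattern Unidata's LDM uses in the IDD: each node's ldmd.conf requests data from an upstream host and allows downstream hosts to connect. A minimal sketch (the hostnames are hypothetical, and the Level II feedtype name and exact patterns should be checked against the LDM documentation for the version in use):

```
# ldmd.conf at a hub site (hostnames are illustrative only)

# Request the Level II feed from a radar-side ingest node:
REQUEST NEXRAD2 ".*" ingest.radar-site.example.edu

# Allow downstream sites (universities, labs, companies) to pull it:
ALLOW NEXRAD2 ^.*\.university\.example\.edu$
ALLOW NEXRAD2 ^ldm\.company\.example\.com$
```

Because every hub both REQUESTs upstream and ALLOWs downstream, adding a node or a radar is a configuration change rather than a redesign, which is what makes the topology scalable and redundant.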
Features of this Concept
  • Many variants exist
  • May require enhancements to LDM, e.g., multi-cast
  • Must consider support of LDM to the commercial sector
  • Key point is to create a national hierarchical distribution system along the lines of the current Unidata IDD
Possible Scenarios
  • Scenario #1: Maintain the current system of 58 radars with OU as the single ingest node
    • Assumptions
      • Line charges paid by the same groups as now, at the same rates
        • 6 Sea Grant sites: $31K/year
        • 6 SRP sites: $72K/year
        • 21 MIT sites: $200K/year
        • 4 Florida sites: $5K/year
        • 10 OU sites: $80K/year
        • 11 other sites (FSL, NASA, GTRI, SLC, RAP, SEA): no cost estimates available
      • Total leveraging is ~$450,000 per year
      • No significant s/w development or 7x24 QOS
      • Maintain current OU staff levels (C. Sinclair at 1.0 FTE and S. Hill at 0.5 FTE)
      • $20K for h/w replacement, $10K for travel (per year)
      • $1K for supplies (per year)
      • KD, DJ, DE at 1 month each (1.0 FTE) (per year)
    • Yearly cost: $355,000 (could be reduced by shifting some existing lines to cheaper alternatives)
    • Advantages
      • No additional h/w costs (above replacement)
      • Continue using a proven, reliable infrastructure
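The Scenario #1 leveraging figure can be checked with a quick sum (a sketch in Python; the site groupings and dollar figures come from the slide, and the gap to the ~$450K total is attributed to the 11 sites with no published estimates):

```python
# Known annual line charges from Scenario #1, in $K/year.
known_line_charges = {
    "Sea Grant (6 sites)": 31,
    "SRP (6 sites)": 72,
    "MIT (21 sites)": 200,
    "Florida (4 sites)": 5,
    "OU (10 sites)": 80,
}

known_total = sum(known_line_charges.values())
print(f"Known line charges: ${known_total}K/year")  # prints "Known line charges: $388K/year"

# The slide's ~$450K/year leveraging figure implies roughly this much
# for the 11 other sites with no cost estimates available:
print(f"Implied remainder: ~${450 - known_total}K/year")  # prints "Implied remainder: ~$62K/year"
```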
Possible Scenarios
  • Disadvantages
    • Not all radars are included
    • Continue with heterogeneous communications infrastructure, latency problems
    • Relies on existing groups to continue paying their local costs
    • Little increase in QOS (i.e., no 7x24)
    • 56K lines will continue to fall behind during active weather
    • Single ingest system at OU provides no redundancy
    • Reliance upon university for private sector mission-critical needs
    • No clear path to deal with data volume increase; however, this may not be critical if NWS system is available relatively soon
Possible Scenarios
  • Scenario #2: Same as Scenario #1, but add the remaining 64 NWS radars
    • Additional assumptions
      • New CAPS technical staff member ($40K/year) for QOS and other work
      • $100K in one-time costs for PCs
      • $200K for one-time line installation costs and routers
      • $50K in travel
      • $5K for supplies
      • $50K in h/w replacement costs and hot spares
      • 30 new lines cost the average of the current OU lines; the rest cost $50/month, based on DSL/cable-modem pricing
    • Year-1 cost: $1.3M (could be reduced by shifting some existing lines to cheaper alternatives)
    • Beyond Year-1: Estimate $900,000/year
Possible Scenarios
  • Advantages
    • No additional h/w costs (above replacement)
    • Continue using a proven reliable infrastructure
    • All 120 NWS radars available
    • Improved QOS via 2nd OU staff person
Possible Scenarios
  • Disadvantages
    • Continue with heterogeneous communications infrastructure, latency problems
    • Relies on existing groups to continue paying their local costs
    • Little increase in QOS (i.e., no 7x24)
    • 56K lines will continue to fall behind during active weather
    • Single ingest system at OU provides no redundancy
    • Reliance upon a university for private sector mission-critical needs
Possible Scenarios
  • Scenario #3: Same as Scenario #2, but add UCAR as a second Abilene ingest node
    • Additional assumptions
      • $100K in computer hardware at UCAR
      • One new UCAR technical staff member
    • Year-1 cost: $1.5M (could be reduced by shifting some existing lines to cheaper alternatives)
    • Beyond Year-1: Estimate $1.2M/year
    • Note: Could possibly add MIT/LL as third redundant node, but this has not been discussed with them
Possible Scenarios
  • Advantages
    • No additional h/w costs (above replacement)
    • Continue using a proven reliable infrastructure
    • All 120 NWS radars available
    • Improved QOS via 2nd OU staff person
    • Greatly improved redundancy, reliability, latencies
Possible Scenarios
  • Disadvantages
    • Continue with heterogeneous communications infrastructure, latency problems
    • Relies on existing groups to continue paying their local costs
    • Little increase in QOS (i.e., no 7x24)
    • 56K lines will continue to fall behind during active weather
    • Reliance upon universities for private sector mission-critical needs (not clear that UCAR can provide needed QOS)
Scenario Summaries (1-3)

[Summary table not reproduced in the transcript.]

* Leverages $450K/year paid by other organizations
** Could try to add MIT/LL as a third node
Possible Scenarios
  • Scenario #4: Same as Scenario #3, but with a national telecommunications carrier providing uniform delivery service to the additional 64 radars only
    • Additional assumptions
      • AT&T line costs for 2-year contract for 64 additional radars is $850,000/year.
      • Mixture of T1, DSL
      • Note that these costs have not been negotiated and likely could be reduced substantially (might also be able to eliminate T1 lines)
      • Removes need for one-time installation charges and router costs
      • Still have the costs of the 64 new LDM PCs
    • Yearly cost: $2.1M (hope this could be brought down to $1.6 or $1.7M with tough negotiation)
Possible Scenarios
  • Advantages
    • No additional h/w costs (above replacement)
    • Continue using a proven reliable infrastructure
    • All 120 NWS radars available
    • Improved QOS via 2nd OU staff person
    • Greatly improved redundancy, reliability, latencies
    • Uniform networking for 64 radars
    • QOS should be much higher (AT&T rapid response)
Possible Scenarios
  • Disadvantages
    • PARTLY heterogeneous communications infrastructure, latency problems
    • Relies on existing groups to continue paying their local costs
    • Little increase in QOS (i.e., no 7x24)
    • 56K lines will continue to fall behind during active weather
    • Reliance upon universities for private sector mission-critical needs
Scenario Summaries (1-4)

[Summary table not reproduced in the transcript.]

* Leverages $450K/year paid by other organizations
** Could try to add MIT/LL as a third node
Possible Scenarios
  • Scenario #5: Same as Scenario #4, but with a national telecommunications carrier providing uniform delivery service to all radars
    • Additional assumptions
      • AT&T line costs for 2-year contract for all radars is $1.4M/year.
      • Mixture of T1, DSL
      • Note that these costs have not been negotiated and likely could be reduced substantially (might also be able to eliminate T1 lines)
      • Removes need for one-time installation charges and router costs
      • Still have the costs of the 64 new LDM PCs
    • Yearly cost: $2.8M (hope this could be brought down to $2.2 or $2.3M with tough negotiation)
Possible Scenarios
  • Advantages
    • No additional h/w costs (above replacement)
    • Continue using a proven reliable infrastructure
    • All 120 NWS radars available
    • Improved QOS via 2nd OU staff person
    • Greatly improved redundancy, reliability, latencies
    • Uniform networking for ALL radars
    • QOS should be much higher (AT&T rapid response)
    • Increased bandwidth needs (e.g., dual-pol, new VCP, ¼ km by ½ degree resolution) could be handled by the telecomm carrier “automatically”
Possible Scenarios
  • Disadvantages
    • Relies on existing groups to continue paying their local costs
    • Still no 7x24 operational support
    • Reliance upon universities for private sector mission-critical needs
Scenario Summaries (1-5)

[Summary table not reproduced in the transcript.]

* Leverages $450K/year paid by other organizations
** Could try to add MIT/LL as a third node
Other Scenarios
  • Scenario #6: Use NWS River Forecast Centers as points of aggregation
    • May make sense only if the NWS wishes to pursue a non-AWIPS collection strategy
    • The general CRAFT concept still could be applied
  • Scenario #7: Use the Planned NWS Distribution System
  • Scenario #8: Create a System Operated Entirely by the Private Sector (no university or UCAR involvement)
Administrative Structure
  • Points of Reference (for the sake of argument)
    • Must be able to ensure 7x24 service (high reliability)
    • Latency must be as low as possible
    • Government receives data at no cost but could/should cost share overall expenses in light of benefits to NCDC (direct ingest for long-term archive), NCEP, FSL, NWS Offices (Level II recorders)
    • Educational institutions receive data at no cost
    • Presumably don’t want another “NIDS arrangement”
  • Options
    • For-profit private company
    • University-based consortium
    • Not-for-profit 501(c)(3)
    • University-based center (e.g., Wisconsin for satellite data)
    • Others?
Key Items for Discussion
  • Sustaining the operation of CRAFT beyond November
  • Establishing private sector requirements
    • Reliability
    • Latency
    • Hardware and software support
  • Meeting private (and academic) sector needs in the short, medium and long term
  • Administrative issues (including data access rules)
  • Dealing with future data volumes
  • Further analysis of system capabilities
    • Impact of weather on data reliability/latency
    • Networking simulation