
Developments in the Louisiana State Grid (LONI)



Presentation Transcript


  1. Developments in the Louisiana State Grid (LONI) Dick Greenwood Louisiana Tech University DOSAR III Workshop at The University of Oklahoma, September 21-22, 2006

  2. LONI Background • In September 2004, the State of Louisiana committed $40M for a statewide optical network • 40 Gb/s bandwidth • Spanning 6 universities and 2 health centers: • LSU • LaTech • UL-Lafayette • Tulane • UNO • Southern University • LSU Health Sciences Centers in • New Orleans • Shreveport

  3. 2001 Plan

  4. Not an Ordinary Optical Network

  5. Systems Deployed: IBM p5 Systems
  – 1 February 2006 – LaTech [bluedawg], AIX v5.3 ● 17 users submitted 307 jobs
  – 15 March 2006 – Tulane [ducky], AIX v5.2 ● 43 users submitted 2,946 jobs
  – 7 August 2006 – ULL [zeke], AIX v5.3 ● 9 users submitted 286 jobs
  – 8 September 2006 – UNO [neptune], AIX v5.3 ● 4 users submitted 10 jobs
  – ~October 2006 – SUBR [lacumba], AIX v5.3
  ● 89 total LONI users

  6. LONI Software Stack • 100% TeraGrid compatible • … • Globus Toolkit • Condor • Virtual Data Toolkit (VDT) • … • The complete software stack is still to be finalized (a sample job submission is sketched below)
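
Since Condor is part of the planned stack, a minimal Condor submit description file illustrates how a user might queue a job on a LONI machine once the stack is in place. This is only a sketch: the executable and data file names are hypothetical placeholders, not actual LONI software.

```
# Hypothetical Condor submit description file (e.g., analyze.sub).
# "analyze" and "run01.dat" are placeholder names for illustration.
universe   = vanilla
executable = analyze
arguments  = run01.dat
output     = analyze.out
error      = analyze.err
log        = analyze.log
should_transfer_files   = YES
when_to_transfer_output = ON_EXIT
queue
```

The job would be submitted with `condor_submit analyze.sub`, and `condor_q` would then show its place in the queue.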

  7. New Dell Linux Clusters To Be Delivered • “Remote” Systems – 132 nodes, four-way Intel “Woodcrest” 2.33 GHz, 4 GB RAM [4.921 TF; 528 GB RAM] – Shipment expected week of 25 September – First deployment will be LSU's system • Next deployment likely LaTech • Central System – 720 nodes, eight-way Intel “Clovertown” 2.33 GHz, 4 GB RAM [53.683 TF; 2.88 TB RAM] – Shipment expected end of November or early December

  8. Linux Clusters: Woodcrest and Clovertown – Woodcrest is dual-core on a single die with a shared 4 MB cache – Clovertown is two dual-core dies in a single package (may change to four cores on a single die) – Both based on 65 nm technology initially, then 45 nm – Both can retire four floating-point operations per clock cycle, rather than two [IBM's POWER5 also retires four per cycle]; this rate is what yields the peak figures on the previous slide, as checked below
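
The bracketed peak figures on slide 7 follow directly from nodes × cores × clock × FLOPs per cycle. A quick arithmetic check, assuming the four-FLOPs-per-cycle rate quoted above:

```python
# Peak-performance check for the two Dell clusters (slide 7 figures).
def peak_tflops(nodes, cores_per_node, clock_ghz, flops_per_cycle=4):
    """Theoretical peak in TF: total cores x clock (GHz) x FLOPs retired per cycle."""
    return nodes * cores_per_node * clock_ghz * flops_per_cycle / 1000.0

print(peak_tflops(132, 4, 2.33))   # "remote" Woodcrest systems: ~4.921 TF
print(peak_tflops(720, 8, 2.33))   # central Clovertown system:  ~53.683 TF
```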

  9. Dell Cluster Description: Environmentals
  – 208 V / 310 A; 64.5 kW; 18 tons of cooling (7,868 cfm)
  – 6 racks total (4 node, 1 control, 1 storage)
  – Rack dimensions: 78.7" H × 23.94" W × 39.93" D
  – Each rack has 4 PDUs and 4 L6-30 208 V connections (24 L6-30 circuits total)
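
The electrical and cooling figures are mutually consistent; a quick check, assuming the standard conversion of 3.517 kW per ton of refrigeration (a constant not stated on the slide):

```python
# Sanity check on the environmental figures above.
volts, amps = 208, 310
kw = volts * amps / 1000      # 64.48 kW -- matches the quoted 64.5 kW
tons = kw / 3.517             # ~18.3 tons -- matches the quoted 18 tons of cooling
print(f"{kw:.2f} kW, {tons:.1f} tons")
```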

  10. Storage
  • Currently very tight: – /home is 25 GB and /scratch is 280 GB – this limits usability – all served via the Network File System (NFS), which is not high performance
  ● Future: – When the central Linux cluster arrives, it will include: ● 14.0 TB raw at each “remote” site in one rack ● 225 TB raw at the central site – Will provide central /home storage as well as global /scratch space – Using the Lustre filesystem, supported by Cluster File Systems, Inc.
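
With quotas that tight, users would need to watch their space closely. A small sketch using only the Python standard library; the mount points are the ones named above, and this assumes it runs on a node where they exist:

```python
# Report free space on the tight LONI filesystems described above.
import shutil

for mount in ("/home", "/scratch"):
    total, used, free = shutil.disk_usage(mount)
    print(f"{mount}: {free / 1e9:.1f} GB free of {total / 1e9:.1f} GB")
```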

  11. Award of NSF MRI: PetaShare
  1 x Overland Storage NEO Series 8000 tape storage system – 400 TB capacity: $153,558. Specifications: Model 500, 12 Native Fibre LTO-3 tape drives, 500 tape slots, redundant power, remote access & control, touch-screen front panel (quote includes shipping and assembly). Deployment site: Louisiana State University.
  5 x Overland Storage REO Series 9000 disk storage system – 44 TB capacity each: 5 x $96,144 = $480,720. Specifications: 44 TB raw SATA, 38 TB usable RAID 5, Protection OS, remote access and control, rackmount (quote includes shipping and assembly). Deployment sites: Louisiana State University, Louisiana Tech University, Tulane University, University of Louisiana at Lafayette, University of New Orleans.
  Total requested equipment cost: $634,278
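
The quoted dollar figures are internally consistent, which a one-line check confirms:

```python
# Verify the PetaShare equipment budget quoted above.
tape = 153_558          # 1 x NEO 8000 tape system
disk = 5 * 96_144       # 5 x REO 9000 disk systems = $480,720
print(tape + disk)      # 634278 -- the total requested equipment cost
```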

  12. Tevfik’s Schedule for PetaShare
  Year 1: The required equipment for PetaShare will be purchased, installed at each participating site, calibrated, and integrated with the existing equipment at these sites. In parallel, we will start developing the key technologies necessary for this instrumentation: data-aware storage, data-aware schedulers, and remote data analysis and visualization.
  Year 2: Technologies developed during the first year will be integrated. Transparent storage selection, data archival, cataloging, and remote access techniques will be developed on top of this integrated system. A user-friendly, uniform interface will be developed for interaction.
  Year 3: The first prototypes will be deployed at the participating sites. Testing and further development will be performed. Application groups will start using the new instrumentation actively.
  Year 4: The developed system will be ported to other platforms. It will be made publicly available to the community. Commercialization potential will be investigated.
  We are currently negotiating with different vendors. The purchase of the equipment may happen in late December or early January.

  13. THE END

  14. EXTRA SLIDES

  15. Coastal Modeling • Hurricane Track Prediction • Storm Surge Modeling • Predicting Wind Effects • Coastal Erosion Modeling • Emergency Response

  16. SCOOP Project • SCOOP Portal

  17. Other Applications • Numerical Relativity • Petroleum Engineering • Computational Fluid Dynamics (CFD) • High Energy Physics • Bioinformatics

  18. LONI Topology – POPs & DWDMs [Slide shows a map of the LONI fiber plant: LONI POPs at LSU, LaTech, Southern, ULL, Tulane, UNO, the LSU Health Sciences Centers in Shreveport and New Orleans, Alexandria, Monroe, Hammond, and Lake Charles, plus the NLR Baton Rouge POP, connected through northern and southern DWDM loops with per-span fiber distances of roughly 11–102 km.]

  19. LONI Topology – Status [Slide repeats the topology map from the previous slide to indicate the deployment status of the POP and DWDM sites.]

  20. LONI Computing Power • At each of the 6 universities: • IBM p5-575 system (112 POWER5 processors, 224 GB RAM) • At LSU: • SuperMike (1024 Xeon processors, 1 TB RAM) • SuperHelix (256 Xeon processors, 256 GB RAM) • MiniMike (32 Xeon processors, 32 GB RAM) • SGI Prism (32 processors, 128 GB RAM) • Nemeaux (64 Mac processors) • LSU Campus Grid with ~2000 processors (soon) • At UL Lafayette: • LITE (384 processors, 384 GB RAM) • At LaTech: • CAPS (30 processors) • And several others…
