Clusters: Changing the Face of Campus Computing

Kenji Takeda, School of Engineering Sciences
Ian Hardy and Oz Parchment, Southampton University Computing Services
Simon Cox, Department of Electronics and Computer Science


Talk Outline

  • Introduction
  • Clusters background
  • Procurement
  • Configuration, installation and integration
  • Performance
  • Future prospects
  • Changing the landscape

University of Southampton
  • 20,000+ students (3,000+ postgraduate)
  • 1,600+ academic and research staff
  • £182 million turnover in 1999/2000

Mission: "to acquire, support and manage general-purpose computing, data communications facilities and telephony services within available resources, so as to assist the University to make the most effective use of information systems in teaching, learning and research activities".
HEFCE Computational and Data Handling Project
  • Existing facilities outdated and overloaded
  • £1.01 million total bid, including infrastructure costs and Origin 2000 upgrade
  • Large compute facility to provide significant local HPC capability
  • Large data store – several Terabytes
  • Upgraded networking: Gigabit to the desktop
  • Staff costs to support new facility
Cluster Computing
  • Extremely attractive price/performance
  • Good scalability achievable with high-performance memory interconnects
  • Fast serial nodes with large memory (up to 4 GB) are affordable
  • High throughput: nodes are cheap
  • SMP still required for large-memory (>4 GB) jobs – for now
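The price/performance claim can be made concrete with a rough cost-per-GFLOPS comparison. A minimal sketch, with entirely hypothetical prices and peak figures (none of these numbers come from the talk):

```python
# Rough cost-per-GFLOPS comparison between a commodity cluster and a
# traditional SMP machine. All prices and peak figures below are
# illustrative assumptions, not numbers from the presentation.

def cost_per_gflops(total_cost_gbp, peak_gflops):
    """Pounds per GFLOPS of peak performance."""
    return total_cost_gbp / peak_gflops

# Hypothetical 64-node PIII cluster: ~£2,000 per node, ~1 GFLOPS/node peak.
cluster = cost_per_gflops(64 * 2_000, 64 * 1.0)

# Hypothetical 64-CPU SMP system at ~£1 million, similar peak per CPU.
smp = cost_per_gflops(1_000_000, 64 * 1.0)

print(f"cluster: £{cluster:.0f}/GFLOPS, SMP: £{smp:.0f}/GFLOPS")
# The cluster comes out several times cheaper per GFLOPS of peak.
```

Real procurement would of course also weigh interconnect cost, memory capacity and sustained (not peak) performance.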
Clusters at Southampton
  • ECS: 8-node Alpha NT and 8-node AMD Athlon clusters
  • Social Statistics/ECS/SUCS: 19-node Intel PIII cluster
  • Chemistry: cluster of 39 AMD Athlon nodes and 4 dual Intel PIII nodes
  • Computational Engineering and Design Centre: 21-node and 10-node dual Intel PIII clusters
  • Aerodynamics and Flight Mechanics Group: 11-node dual Intel PIII cluster with Myrinet 2000
  • ISVR: 9-node dual Intel PIII Windows 2000 cluster
  • Several high-throughput workstation clusters on campus
  • Windows clusters research
User Profiles
  • Users from many disciplines:
    • Engineering, Chemistry, Biology, Medicine, Physics, Maths, Geography, Social Statistics
  • Many different requirements:
    • Scalability, memory, throughput, commercial apps
  • Want to encourage new users and new applications
  • Ask users what they want – open discussion
Procurement
  • General-purpose cluster specification
  • Open tender process
  • Vendors ranged from big-iron companies to home PC suppliers
  • Shortlisted vendors for detailed discussions
  • Varied user requirements
  • Limited budget – value for money crucial
  • Heterogeneous configuration optimal
  • Balanced system: CPU, memory, disk
  • Boxes-on-shelves or racks?
  • Management options: serial network, power strips, Fast Ethernet backbone
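One way to balance varied requirements against a limited budget when comparing tenders is a weighted score per bid. A minimal sketch; the criteria, weights and vendor scores below are entirely hypothetical:

```python
# Weighted-score sketch for comparing cluster tenders against varied
# user requirements. Criteria, weights and per-vendor scores are
# hypothetical illustrations, not figures from the actual procurement.

WEIGHTS = {"price": 0.35, "cpu": 0.25, "memory": 0.20, "disk": 0.10, "support": 0.10}

def tender_score(scores):
    """Weighted sum of per-criterion scores (each on a 0-10 scale)."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

bids = {
    "big-iron vendor": {"price": 4, "cpu": 9, "memory": 9, "disk": 8, "support": 9},
    "PC supplier":     {"price": 9, "cpu": 7, "memory": 6, "disk": 6, "support": 5},
}

# Rank bids by score; with value-for-money weighted heavily, a cheap
# commodity supplier can edge out a technically stronger big-iron bid.
shortlist = sorted(bids, key=lambda v: tender_score(bids[v]), reverse=True)
for vendor in shortlist:
    print(f"{vendor}: {tender_score(bids[vendor]):.2f}")
```

Shifting the weights (e.g. towards memory capacity) naturally reorders the shortlist, which is why the open discussion with users came first.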
IRIDIS Cluster
  • Boxes-on-shelves
  • 178 nodes
    • 146 × dual 1 GHz PIIIs
    • 32 × 1.5 GHz P4s
  • Myrinet 2000 connecting 150 CPUs
  • 100 Mbit/s Fast Ethernet
  • APC power strips
  • 3.2 TB IDE-Fibre disk
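As a quick sanity check, the headline numbers for the configuration above can be totalled up. The peak-GFLOPS line assumes one floating-point operation per clock cycle, which is a simplification (per-cycle throughput depends on the workload and instruction mix):

```python
# Aggregate figures for the IRIDIS configuration listed above.
dual_piii_nodes, piii_ghz = 146, 1.0   # dual-CPU 1 GHz PIII nodes
p4_nodes, p4_ghz = 32, 1.5             # single-CPU 1.5 GHz P4 nodes

total_nodes = dual_piii_nodes + p4_nodes          # 146 + 32 = 178 nodes
total_cpus = dual_piii_nodes * 2 + p4_nodes * 1   # 292 + 32 = 324 CPUs

# Simplified peak estimate: 1 flop per cycle per CPU.
peak_gflops = dual_piii_nodes * 2 * piii_ghz + p4_nodes * p4_ghz

print(total_nodes, total_cpus, peak_gflops)  # 178 324 340.0
```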
Installation & Integration
  • Initial installation by vendor – Compusys plc
  • One-week burn-in still turned up 3 DOA nodes
  • Major switch problem fixed by supplier
  • Swap space increased on each node
  • No problems since
  • Pallas, Linpack and NAS benchmarks, plus user codes, for a thorough system shakedown
  • Scheduler for flexible partitioning of jobs
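The "flexible partitioning" idea can be sketched as a toy placement policy: parallel jobs go to the Myrinet-connected partition, serial/throughput jobs to the plain Fast Ethernet nodes. The partition sizes mirror IRIDIS (75 dual-CPU Myrinet nodes ≈ 150 CPUs, 103 remaining nodes), but the policy itself is a hypothetical illustration, not the actual scheduler used:

```python
# Toy sketch of partition-aware job placement on a heterogeneous cluster.
# Partition sizes mirror the IRIDIS layout; the policy is hypothetical.
free = {"myrinet": 75, "ethernet": 103}  # free nodes per partition

def place(job_nodes, parallel):
    """Return the partition a job is placed in, or None if it must wait.

    Tightly coupled parallel jobs need the low-latency Myrinet partition;
    serial/throughput jobs run fine over Fast Ethernet.
    """
    part = "myrinet" if parallel else "ethernet"
    if free[part] >= job_nodes:
        free[part] -= job_nodes
        return part
    return None

print(place(16, parallel=True))    # myrinet
print(place(1, parallel=False))    # ethernet
print(place(100, parallel=True))   # None: only 59 Myrinet nodes left
```

A production scheduler would also handle queues, priorities and backfill; the point here is only that one machine can serve both scalability-hungry and throughput-hungry users.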
NAS Serial Benchmarks

[Chart: NAS serial benchmark results – bigger is better]

Chemistry Codes

[Chart: chemistry code results – smaller is better]

Future Prospects
  • Roll-out Windows 2000/XP service
    • In response to user requirements
    • Increase HPC user-base
    • Drag-and-drop supercomputing
  • Expand as part of Southampton Grid
    • Integration with other compute resources on and off campus
    • Double in size over next few years
Changing the Landscape
  • Serious compute power available to many more users – HPC for the masses
  • Heterogeneous systems: tailored partitions for different types of users are easy to cater for
  • Improved compatibility between desktops and servers – less intimidating
  • New pricing model for vendors – costs are transparent to the customer

Affordable, Expandable, Grid-able