
CSG - Research Computing Redux




Presentation Transcript


  1. CSG - Research Computing Redux John Holt, Alan Wolf University of Wisconsin - Madison

  2. What we have: • Infrastructure • Projects and Facilities • Support

  3. Infrastructure • 21st Century Network • 10 Gb backbone • Fully online (05-06) • IBM Enterprise Storage Server (ESS) • Northern Tier Network Consortium • Fiber connections for the northern states • Wisconsin to Washington • Wisconsin’s connection live by Fall

  4. Abilene & NLR • Abilene is the Internet2 network

  5. Abilene, NLR & NTNC

  6. Projects and Facilities • Wisconsin Advanced Internet Laboratory • Condor • Grid Laboratory of Wisconsin (GLOW) • Genome Center Supercomputer facility • Others • Sampling of other projects

  7. Wisconsin Advanced Internet Laboratory • http://wail.cs.wisc.edu/ • 50 IP routers and switches • 100 end hosts • Other networking gear • Create arbitrary configurations of systems

  8. Condor • Hunting for wasted computing resources • http://www.cs.wisc.edu/condor/ • Engineering 587 (mixed) • CS approximately 750 (dedicated) • Plus most desktops in the department • Untapped pools • Infolabs and administrative machines • Difficulty with software licenses • Privacy and security
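As a concrete illustration of how Condor harvests idle cycles, a job is described in a small submit file and handed to the pool, which matches it to an unused machine. A minimal sketch of a classic Condor submit description file follows; the executable and file names are hypothetical:

```
# job.sub -- hypothetical minimal Condor submit description file
universe   = vanilla        # run an ordinary (non-relinked) program
executable = analyze        # program to run on the matched machine
arguments  = input.dat
output     = analyze.out    # where stdout lands
error      = analyze.err    # where stderr lands
log        = analyze.log    # Condor's record of the job's lifecycle
queue                       # submit one instance of the job
```

Submitted with `condor_submit job.sub`; the vanilla universe is the least-demanding choice, which matters when opportunistically scavenging desktop machines like those in the CS pool.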

  9. Condor in CS

  10. CS Condor pool usage

  11. GLOW • Another Condor cluster • 330 CPUs • (mostly) isolated and restricted to contributing PIs • Mostly genomics research

  12. Genome Center of Wisconsin • Donated hardware (36-CPU E10000) • V880 (8 CPUs) • Aging, but useful for large-memory applications not suitable for clusters • Local mirror of NCBI genomic databases • Subscription model

  13. Others • Laboratory for Optical and Computational Instrumentation • http://loci.wisc.edu/ • http://openmicroscopy.org • Social Science Computing Cooperative • Small Condor pool and statistical analysis • http://www.ssc.wisc.edu/ • Space Science and Engineering Center • http://www.ssec.wisc.edu/ • Center for Sustainability and the Global Environment • http://www.sage.wisc.edu/

  14. Others, continued • Digital Library Projects • Provide access to rare and fragile resources for research • http://uwdcc.library.wisc.edu/index.html • High Energy Physics - GLOW • http://www.hep.wisc.edu/computing • Medical Physics - GLOW • http://www.medphysics.wisc.edu/cluster

  15. Support • Biology New Media Center • Research and teaching • Confederation of DoIT, CBE, and Biotech Center • Centrally supported • Gigabit connectivity to the desktops • Research and learning technology consulting • 3D molecular modeling, biosequence analysis and phylogenetic analysis consulting • Research communication • Access to shared specialized resources

  16. Support (continued) • Research storage support • Local > Shared > Archival • DSpace (or something like it) • Provide storage for collected final works • Datasets, research reports, etc. • Make these available to all researchers and teachers
