
UABgrid : A campus-wide distributed computational infrastructure


Presentation Transcript


  1. UABgrid: A campus-wide distributed computational infrastructure. University of Alabama at Birmingham. UABgrid Architecture Team: Jill Gemmill, Purushotham Bangalore, John-Paul Robinson

  2. Acknowledgments This work has been supported by: • Office of the Vice President for Information Technology • Department of Computer & Information Sciences, School of Natural Sciences and Mathematics • Enabling Technology Laboratory, School of Engineering • National Science Foundation • NSF ANI-0330543 “NMI Enabled Open Source Collaboration Tools for Virtual Organizations” • NSF ANI-0123937 via SURA-2002-103 Subcontract “UAB Middleware Testbed Program: Integrated Directory Services, PKI, Video, and Parallel Computing” • NSF CNR-0420614 “Computer and Information Sciences Grid Node Research Facility” • Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

  3. UAB Background • 36-year-old urban medical research university • 82 city blocks • 13 schools (= 13 deans) • 17,000 students; 16,000 employees • Students are 26.3 percent African American and 60.2 percent female • 70 research centers • 20th in NIH funding (4th in the Southeast) • $433 million in research funding; doubling every 10 years • No history of centrally supported HPC or other research-oriented computing services (e.g., statistics) • The Alabama Supercomputer Authority

  4. What’s a Campus Grid? • Strategic View: • Maximize use of the university’s investment in computational resources • Minimize administrative effort involved in campus-wide resource sharing • By leveraging investments in Identity Management, WebISO, Directories, and Network infrastructures • UABgrid is a federation of resource owners who happen to share a common identity provider

  5. UABgrid Partners • Office of VPIT: Sheila Sanders, VPIT; IT Academic Computing: David L. Shealy, Jill Gemmill, John-Paul Robinson • 128-node cluster; 64-node P3 cluster; desktop Condor pool; 6 terabytes IBP storage • Department of Computer and Information Sciences: Tony Skjellum, CIS Chair; Puri Bangalore, Asst. Prof. • 256-processor & 64-processor clusters; Viz Wall; Parallel Storage System • Engineering Enabling Technology Lab: Bharat Soni, Chair of Mechanical Engineering; Alan Shih, ETLab Director • 256-processor and 128-processor clusters; Viz Wall; High-Speed Storage Systems

  6. Current UABgrid Applications • Bioinformatics • BLAST, Gene Sequence Analysis, Structural Biology, Microarray Data Analysis, Visualization • PDE • Automotive & Industrial, Surface Simulations, Optimization • Grid and Middleware Research • Scheduling, Load Balancing, Granular Authorization

  7. UABgrid Architecture Today: Phase I [architecture diagram; gigabit network]

  8. UABgrid Phase II [architecture diagram; additional grid nodes, 10 GigE network]

  9. Factors Supporting Resource Sharing • The Provost and VP for Research are being inundated with competing requests from schools to purchase clusters • Deans who have acquired clusters find themselves losing classroom space to equipment racks and facing large power and air-conditioning bills • Clusters, large databases, schedulers, etc. require expensive expertise

  10. Grid User Management • Grid identity comes from the enterprise authentication system (“BlazerID”) • WebISO is leveraged to provide the digital certificate, private key, and proxy certs behind the scenes • Grid portal and per-system user accounts are provisioned automatically, saving much administrative effort (Phase I: grid-mapfiles; Phase II: LDAP-stored POSIX accounts + GridShib)
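
  A minimal sketch of the Phase I provisioning step described above, assuming Globus-style grid-mapfiles: a WebISO-authenticated BlazerID is mapped to a certificate subject and appended to the grid-mapfile so the per-system account needs no manual setup. The file path is the conventional Globus location; the DN layout and function names are illustrative, not UABgrid's actual code.

      # Sketch only: map a BlazerID to a grid-mapfile entry (Phase I style).
      GRID_MAPFILE = "/etc/grid-security/grid-mapfile"   # conventional Globus location

      def dn_for_blazerid(blazerid, common_name):
          """Build the certificate subject issued to this user (illustrative DN layout)."""
          return f"/O=UABgrid/OU=UAB/CN={common_name} ({blazerid})"

      def provision(blazerid, common_name):
          """Append a 'subject DN -> local account' mapping if it is not already present."""
          entry = f'"{dn_for_blazerid(blazerid, common_name)}" {blazerid}\n'
          try:
              with open(GRID_MAPFILE) as f:
                  if entry in f:
                      return                       # already provisioned
          except FileNotFoundError:
              pass                                 # first entry ever written
          with open(GRID_MAPFILE, "a") as f:
              f.write(entry)

      provision("jsmith", "Jane Smith")            # hypothetical user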

  11. Grids for Mere Mortals • For jobs run repeatedly where only the database or query varies, it is worthwhile to build a user-friendly interface and also to optimize use of resources • Example: BLAST (National Library of Medicine gene sequence matching software) http://www.ncbi.nlm.nih.gov/Education/BLASTinfo/information3.html

  12. Improving the Interface : GridBLAST • Access using BlazerID and password • Queries and Results easily uploaded & downloaded • Web UI can be hosted on any server • Web UI can be written in any development language
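
  Because the web UI is just a front end, query submission can be scripted from any language. Below is a minimal Python sketch; the portal URL, form fields, and session handling are assumptions for illustration, not the actual GridBLAST interface.

      import urllib.parse
      import urllib.request

      PORTAL_URL = "https://uabgrid.uab.edu/gridblast/submit"   # hypothetical endpoint

      def submit_query(fasta, database, session_cookie):
          """POST one FASTA query to the portal; return whatever job id the portal reports."""
          data = urllib.parse.urlencode({"query": fasta, "db": database}).encode()
          req = urllib.request.Request(PORTAL_URL, data=data,
                                       headers={"Cookie": session_cookie})
          with urllib.request.urlopen(req) as resp:
              return resp.read().decode().strip()

      # The session cookie would come from the BlazerID/password login step.
      job_id = submit_query(">seq1\nMKTAYIAKQR", "nr", "session=abc123")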

  13. Improving Performance: G-BLAST • A native Grid Service Interface for BLAST • G-BLAST provides automatic BLAST algorithm selection based on # of queries, length of queries, size of the database used, and machines available • BLAST algorithms employed: multi-threaded BLAST, database-splitting BLAST (e.g., mpiBLAST), query-splitting BLAST
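
  A minimal sketch of the kind of decision G-BLAST automates, using the four inputs the slide lists; the thresholds and branching below are illustrative assumptions, not G-BLAST's actual selection rules.

      def select_blast_variant(num_queries, avg_query_len, db_size_mb, num_hosts):
          """Pick a BLAST variant from workload shape and available machines (illustrative)."""
          if num_hosts <= 1:
              return "multi-threaded BLAST"        # single node: just use its cores
          if num_queries >= num_hosts:
              return "query-splitting BLAST"       # enough queries to keep every host busy
          if db_size_mb > 1024 or avg_query_len > 10_000:
              return "database-splitting BLAST"    # big db or long queries: partition the db (e.g., mpiBLAST)
          return "multi-threaded BLAST"

      print(select_blast_variant(num_queries=500, avg_query_len=400,
                                 db_size_mb=4096, num_hosts=16))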

  14. G-BLAST architecture [diagram: users submit a query (1) through the web interface to a client program; the client queries the Application Information Service (AIS) (2) and receives a response (3); the Grid Service Interface’s scheduler/invoker, using resource information from the GIS, dispatches the job (4) to one of the BLAST grid services (BLAST1, BLAST2, … BLASTn); results come back (5), the user is notified (6) and retrieves the results with a further query (7)]

  15. G-BLAST Scheduler Architecture [diagram: the AIS combines a BLAST benchmark database with resource information from GIIS/GRIS; a resource broker hands jobs to a job submission agent, and an analyzer tracks job IDs (JIDs)]
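
  One plausible reading of the broker step, sketched with made-up data structures: rank resources by their time on the BLAST benchmark, weighted by the load reported through GIIS/GRIS, and hand the job to the winner. The scoring formula is an assumption, not the published scheduler.

      from dataclasses import dataclass

      @dataclass
      class Resource:
          name: str
          benchmark_sec: float    # time on the reference BLAST benchmark
          queued_jobs: int        # load reported by the information service (GIIS/GRIS)

      def pick_resource(resources):
          """Choose the resource with the best benchmark time scaled by current queue depth."""
          return min(resources, key=lambda r: r.benchmark_sec * (1 + r.queued_jobs))

      clusters = [Resource("cis-cluster", 120.0, 4), Resource("etlab-cluster", 90.0, 9)]
      print(pick_resource(clusters).name)   # the job submission agent would submit here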

  16. UABgrid Funding and Management Today • All equipment has been purchased with various grant funds • ETLab has been designated as a campus resource • ETLab has contracted for 50% of one IT-provided Unix administrator to manage its clusters • Academic Computing has 2.3 employees and provides other support in addition to HPC • Computer Science / NS&M resources are available to other campus computational scientists • Computer Science has 1 administrator for all CIS systems • Each research department hires its own programmer(s) • Developing sustainable funding model(s) is challenging

  17. Federated Grids • Exploring cross-domain resource sharing scenarios • Federated Identity : experiences in SURAgrid • Federated Attributes : myVocs and GridShib

  18. SURAgrid (www.sura.org) [diagram: grid portals at UAB (BlazerID and password login), the University of Virginia (digital certificate login), and Louisiana State University (Kerberos login) reach shared resources, including the Texas Advanced Computing Center, through their campus CAs (UABgridCA, UVACA, LSUCA) cross-certified via the SURAGrid CA bridge; a SURAGrid portal fronts the shared resources]

  19. myVocs: a Virtual Organization Service Center • Use of Shibboleth in Grids provides attribute-based access control (not just identity) • Example: faculty may be assigned higher priority in job queues than students • For VOs, the most important attribute is “member of VO ABC”, and VO memberships typically cross domains • myVocs offers easy self-management for VOs and expects a web browser as the primary access to resources • Combined with GridShib, myVocs enables VO membership-based access to grid resources
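
  A minimal sketch of the attribute-based decision described above: access and queue priority key off Shibboleth-style attributes (VO membership, affiliation) rather than the individual identity. The attribute names and priority values are illustrative assumptions.

      def job_priority(attributes):
          """Return a queue priority, or None if the user may not submit at all."""
          if "ABC" not in attributes.get("memberOf", []):   # the "member of VO ABC" attribute
              return None                                   # not in the VO: no access
          if "faculty" in attributes.get("affiliation", []):
              return 10                                     # faculty get a higher-priority queue
          return 1                                          # students and other members

      print(job_priority({"memberOf": ["ABC"], "affiliation": ["faculty"]}))   # -> 10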

  20. Inside myVocs: Attribute Aggregation [diagram]

  21. Q & A • Jill Gemmill • jgemmill@uab.edu • Further Information: • http://uabgrid.uab.edu
