
OxGrid, A Campus Grid for the University of Oxford



Presentation Transcript


  1. OxGrid, A Campus Grid for the University of Oxford Dr. David Wallom

  2. Outline • Why make a campus grid? • How are we making it? • Computational capability • Data capability

  3. Why a grid? • Many new research problems need access to massive computational and data capacity, and the available capability is the limiting factor • If the need is too large for a single existing resource, construct a system able to use a number of appropriate resources concurrently • Designed so that: • users sign on once to access multiple resources and switch between them seamlessly • the system layout can be altered dynamically without user interference • once data is placed on, or a job started on, a remote resource, its status is monitored to make sure it stays running/available

  4. Why make a campus grid? • Many computers throughout the University under-utilised: • PCs • Idle time (about 16hr/day for an average desktop) • Unused disk space (~60% of a modern hard-drive) • already purchased – depreciating daily • Readily available resource, e.g. OULS has up to 1200 desktop computers. • Large Servers • expensive to purchase, house and run (extra staff). • Rarely 100% utilised

  5. OxGrid, a University Campus Grid • Single entry point for Oxford users to shared and dedicated resources • Seamless access to the National Grid Service (NGS) and the Oxford Supercomputing Centre (OSC) for registered users • Single sign-on using the general UK e-Science infrastructure, integrated with current methods • [Architecture diagram: Oxford users connect through the OxGrid central management layer (computational task distribution, storage management) to college resources, departmental resources, the OSC and the NGS]

  6. Central System Components • Computational Task Distribution: • Resource Broker, handles user access and the distribution of submitted tasks • Information Service, holds all system capability and status information on which the resource broker makes decisions • Systems Monitoring, graphical presentation of the monitoring system for the helpdesk interface • User Management, controls a virtual community and allows access to the various resources • Accounting Service, allows full-system and single-resource use to be recorded and charged for • Storage Management: • Create a dynamic multi-homed virtual file system • Single central controller and large file-store for immediate access • Connected to remote file systems for access to larger storage capacity • Provide metadata mark-up for improved data mining
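The resource broker's matchmaking step above can be sketched in a few lines. This is a hypothetical illustration, not OxGrid's actual code: the `Resource` fields and `select_resource` function are invented, and a real broker would consume live information-service data rather than a hard-coded list.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    """One entry as it might be advertised by the information service (fields illustrative)."""
    name: str
    free_cpus: int
    free_disk_gb: float
    up: bool

def select_resource(resources, cpus_needed, disk_needed_gb):
    """Return the matching up resource with the most free CPUs, or None if nothing matches."""
    candidates = [r for r in resources
                  if r.up and r.free_cpus >= cpus_needed
                  and r.free_disk_gb >= disk_needed_gb]
    return max(candidates, key=lambda r: r.free_cpus, default=None)

# Snapshot of system status (invented numbers, for the sketch only)
status = [
    Resource("ouls-condor", free_cpus=300, free_disk_gb=50.0, up=True),
    Resource("chem-cluster", free_cpus=8, free_disk_gb=500.0, up=True),
    Resource("osc", free_cpus=0, free_disk_gb=1000.0, up=True),
]

print(select_resource(status, cpus_needed=4, disk_needed_gb=10.0).name)  # → ouls-condor
```

A production broker would also weigh accounting data and queue lengths, but the core decision is this kind of filter-then-rank over information-service snapshots.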

  7. Virtual Organisation/User Management & Accounting • The Grid Security Infrastructure uses a mapping between the Distinguished Name (DN), as defined in a digital certificate, and local usernames on each resource • It is important that, on each resource a user expects to use, their DN is mapped locally • OxVOM • Custom in-house web-based user interface • Persistent information stored in a relational database • User DN list retrieved by remote resources using standard tools • Accounting is the basis of a possible charging model
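The DN-to-local-username mapping described above is conventionally held in a grid-mapfile on each resource. A minimal sketch (the DNs and account names here are illustrative, not real users):

```
"/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=some user" someuser
"/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=another user" griduser01
```

Each line maps one certificate DN to a local account; a user whose DN is absent from a resource's mapfile cannot use that resource, which is why the per-resource mapping matters.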

  8. Computational Resources • Core, accessible to all Campus Grid users • Individual Departmental Clusters (dedicated compute resources) • Condor clusters of PCs (cycle scavenging) • External, accessible to users that have registered with them • National Grid Service • OSC
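For the cycle-scavenging Condor pools above, work is described in a submit file handed to the pool. A minimal sketch for submitting many independent tasks (the executable and filenames are hypothetical, e.g. for the graphics-rendering use case):

```
# render.submit — queue 100 independent processes, one frame each
universe   = vanilla
executable = render_frame.sh
arguments  = $(Process)
output     = frame_$(Process).out
error      = frame_$(Process).err
log        = render.log
queue 100
```

Condor then places each queued process on whichever idle desktop matches, which is how the ~16 hr/day of idle PC time becomes usable capacity.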

  9. Data Management • Engagement of data-intensive as well as computationally intensive research groups • Provide a remote store for those groups that cannot resource their own • Distribute the client software as widely as possible, including to departments that are not currently engaged in e-Research

  10. Data Management • Software for creation of the system • Storage Resource Broker (SRB) to create a large virtual datastore • Through a central metadata catalogue, users interface with a single virtual file system, though the physical volumes may be on several networked resources • Built-in metadata capability

  11. SRB Architecture • [Diagram: the user contacts the MCAT server, which holds the metadata catalogue (MCAT) and brokers access to Disk Servers 1–4]

  12. SRB as a Data Grid • [Diagram: multiple SRB servers federated around a central MCAT database] • A data grid has an arbitrary number of servers • Complexity is hidden from users

  13. SRB Client Implementations • inQ – Windows GUI browser • Jargon – Java SRB client classes • Pure Java implementation • mySRB – Web-based GUI • run using a web browser • Matrix – Web service for SRB workflow • All of these allow direct interaction with the data grid
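Alongside the clients above, SRB also ships the Scommands, a Unix-style command-line client. A sketch of a typical session, assuming a running SRB/MCAT server and a configured user profile (the filenames and collection names are illustrative):

```
Sinit                       # authenticate to the MCAT-enabled SRB server
Sput results.dat mydata/    # store a local file into the virtual file system
Sls mydata                  # list the SRB collection
Sget mydata/results.dat .   # retrieve the file, wherever it physically lives
Sexit                       # end the session
```

The point of the single virtual file system is visible here: `Sget` works the same whichever disk server actually holds the file.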

  14. Users • Installed several example applications • Graphics rendering

  15. Use of Computing Power in the Humanities

  16. Users • Installed several example applications • Graphics rendering • Physics • Biochemistry • Computational Users • Chemistry & Materials Science • Data Users • IBVRE • Contacting currently registered users of both the OSC and the NGS • It is beneficial to these systems to move off users that don't need to be there, freeing capability for those that do • Data provision is an integral component of the grid • Starting to contact large data users

  17. Conclusions • Users are already able to log onto the Resource Broker and schedule work onto the NGS, OSC and OUCS Condor systems • We are working as quickly as possible to engage more users • We need these users to then go out and evangelise, bringing in both more users and more resources

  18. Contact • Email: david.wallom@ierc.ox.ac.uk • Telephone: 01865 283378
