
UK e-Science Grid




Presentation Transcript


  1. UK e-Science Grid Dr Neil Geddes, CCLRC Head of e-Science, Director of the UK Grid Operations Support Centre

  2. Situation Today: the National Grid Service (Level-2 Grid). [Slide shows a UK map marking the sites: Leeds, Manchester, DL, Oxford and RAL.]

  3. In the Future: the NGS, coordinated by the UK Grid Operations Support Centre. [Slide shows a diagram of the future NGS.]

  4. GOSC. [Slide shows a diagram: the ETF feeds software with proven capability & realistic deployment experience into the NGS; OMII 'Gold' services and other software sources supply prototypes & specifications, with feedback & future requirements flowing back; the GOSC provides operations and deployment/testing/advice for the NGS, UK campus and other Grids, and EGEE.]
  • Core of GOSC built around experience in deploying and running the National Grid Service (NGS)
  • Support service
  • Important to coordinate and integrate this with deployment and operations work in EGEE, LCG and similar projects
  • EGEE – low-level services, CA, GOC, CERT...
  • Dedicated deployment and operations management will be a key component

  5. GOSC Roles
  • Core UK Grid services: simple registry, data transfer, job submission, security, data access
  • Key services (to be supported for all Grids): authorisation, notification, workflow, monitoring and accounting, Grid management services, VO support
  • Services to be coordinated with others (e.g. OMII, NeSC, LCG): integration testing, compatibility & validation tests, user management, training
  Timeline:
  • May/June – develop a roadmap for the development of the Grid operations centre over the next two years
  • June – develop the GOSC proposal for the next two years, including deployment and support plans, interfaces to related projects, integration of non-core resources into the NGS, and criteria for service evaluation
  • September – formal start of GOSC
  • October – NGS "production service"; compatibility with EGEE

  6. MPI, co-scheduling … Oxford and Leeds (White Rose Grid)

  7. Manchester and CCLRC-RAL

  8. Also includes:
  • CSAR (http://www.csar.cfs.ac.uk/): a 256-processor Itanium2 SGI Altix and a 512-processor Origin3800
  • HPCx (http://www.hpcx.ac.uk/): full installation = 1600 IBM p690+ Regatta processors; currently 1236 processors
  • Data and applications: EMBL nucleotide sequences; NCBI, BLAST, EMBOSS, FASTA, Gaussian
  • Thus, the NGS provides access to over 2000 processors, over 36 TB of "data-grid" capacity, common scientific applications and extensive data archives
  • Other resource providers are anticipated to join in the future …

  9. NGS Status, 28 May 2004
  • All 4 cluster nodes operational
  • Announcement made on 5th April to user communities in the UK
  • "Pre-production"
  • ETF, NeSC, HPCx, and also via the JISC web site
  • Common grid-mapfile for user management
  • VOM server installation abandoned; VOMS is now under review & installation by the Grid Support Centre
  • Information service operating using the BDII infrastructure
  • GridICE installation to give a front-end interface to BDII data
  • GridMon work to allow NGS monitoring, almost complete
  • Federated Ganglia operational to allow centralised monitoring of resources and load across the four JISC/CCLRC sites
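The common grid-mapfile mentioned above is a plain-text file in which each line pairs a quoted certificate distinguished name (DN) with a local account name; distributing the same file to every site lets a given DN resolve to a consistent account across the NGS. An illustrative fragment (the DNs and account names here are made up, not real NGS entries):

```
"/C=UK/O=eScience/OU=CLRC/L=RAL/CN=jane doe" ngs0001
"/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=john smith" ngs0002
```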

  10. What is Available? Ab initio molecular orbital theory (eMaterials); the DL_POLY molecular simulation package

  11. Users and Projects
  Users
  • 63 users registered (excluding sysadmins etc.)
  • Leeds, Oxford, UCL, Cardiff, Southampton, Imperial, Liverpool, Sheffield, Cambridge, Edinburgh, QUB, BBSRC and CCLRC
  Projects
  • e-Minerals
  • e-Materials
  • Orbital dynamics of satellite galaxies (Oxford)
  • Bioinformatics using BLAST (John Owen, BBSRC)
  • GEODISE project (Southampton)
  • Singlet meson project within the UKQCD collaboration (QCDGrid, Liverpool + Edinburgh)
  • Census data analysis for geographical information systems (Sheffield)
  • MIAKT project (image registration for medical image analysis, Imperial)
  • e-HTPX project
  • RealityGrid – computational chemistry

  12. More than just computation and data resources… In future the NGS will include services to facilitate collaborative (Grid) computing:
  • Authentication (PKI, X.509)
  • Job submission/batch service
  • Resource brokering
  • Authorisation
  • Virtual organisation management
  • Certificate management
  • Information service
  • Data access/integration services (SRB/OGSA-DAI/DQPS)
  • National registry (of registries)
  • Data replication
  • Data caching
  • Grid monitoring
  • Accounting
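Several of the services listed above (authentication, authorisation, VO management) hinge on one step: mapping the distinguished name (DN) from a user's X.509 certificate to a local account, which the NGS does via its shared grid-mapfile. A minimal Python sketch of that lookup follows; the file format matches the grid-mapfile convention, but the DNs and account names are invented for illustration.

```python
def parse_grid_mapfile(text):
    """Parse grid-mapfile lines of the form: "<DN>" <local_account>."""
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.startswith('"'):
            # The DN is quoted because it contains spaces; the local
            # account name follows the closing quote.
            dn, _, account = line[1:].partition('" ')
            mapping[dn] = account.strip()
    return mapping


def authorise(mapping, dn):
    """Return the local account for a DN, or None if the DN is not mapped."""
    return mapping.get(dn)


# Illustrative entries only, not real NGS users.
example = '''
"/C=UK/O=eScience/OU=CLRC/L=RAL/CN=jane doe" ngs0001
"/C=UK/O=eScience/OU=Oxford/L=OeSC/CN=john smith" ngs0002
'''

users = parse_grid_mapfile(example)
print(authorise(users, "/C=UK/O=eScience/OU=CLRC/L=RAL/CN=jane doe"))  # prints ngs0001
```

In production this mapping is consulted by the Grid service itself (e.g. the gatekeeper) before any job submission or data access is allowed; the sketch only shows the lookup logic, not certificate verification.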

  13. NGS Resources
  • Free at the point of use (for UK e-Science)
  • Site contributions defined by a Service Level Description
  • 4+2 core sites now; Cardiff and Bristol any day now
  • Anyone can join: agree to the base SLD; VDT (RB/VOM) … EGEE
  • Level of support/resource up to providers
  • Access conditions up to providers
  • Needs VO management and monitoring/accounting
  • Common core services

  14. Concluding Remarks
  • UK e-Science Grid is in pre-production mode
  • Goal is to provide national services, not just access to storage + CPU
  • Likely to be a component of UK computing provision
  • Compatibility with EGEE, TeraGrid etc. is important

  15. The End
