
UK eScience BoF Session AHM – Nottingham - September 2003






Presentation Transcript


  1. UK eScience BoF Session AHM – Nottingham - September 2003 “Intersecting UK Grid and EGEE/LCG/GridPP” GridPP8 – Bristol – 23rd September 2003

  2. BoF Agenda • Applications & Requirements • Technical Exchanges & Collaboration • Common Strategies / Roadmap • Discussion GridPP8 – Bristol – 23rd September 2003

  3. Applications… GridPP8 – Bristol – 23rd September 2003

  4. What does the eScience Grid currently look like? Globus v2 installed at all regional eScience centres. Heterogeneous resources (Linux clusters, SGI O2/3000, SMP Sun machines). eScience Certificate Authority. Mark Hayes GridPP8 – Bristol – 23rd September 2003
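In practice, getting onto one of these GT2 resources followed a standard pattern: authenticate using the eScience CA-issued certificate, then submit through a GRAM gatekeeper. A minimal sketch with the stock Globus Toolkit 2 clients (the gatekeeper address is hypothetical):

    # Create a short-lived proxy credential from the user's eScience CA certificate
    grid-proxy-init

    # Run a trivial command on a remote resource via its GRAM gatekeeper
    globus-job-run grid-node.example.ac.uk/jobmanager-pbs /bin/hostname

    # The same submission expressed as explicit RSL
    globusrun -r grid-node.example.ac.uk/jobmanager-pbs '&(executable=/bin/hostname)(count=1)'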

  5. GridPP & EDG Dedicated Linux clusters running EDG middleware (globus++). Very homogeneous resources. Resource Broker (based on Condor). LDAP-based VO management. Mark Hayes GridPP8 – Bristol – 23rd September 2003
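By contrast, EDG jobs are described in JDL and handed to the Resource Broker, which matches them against published resource information. A sketch of the EDG 2.0 workflow (file names, the VO and the job identifier are illustrative):

    # hostname.jdl -- minimal EDG job description
    Executable    = "/bin/hostname";
    StdOutput     = "hostname.out";
    StdError      = "hostname.err";
    OutputSandbox = {"hostname.out", "hostname.err"};

    # Submit via the Resource Broker, poll status, retrieve the output sandbox
    edg-job-submit --vo atlas hostname.jdl
    edg-job-status <jobId>
    edg-job-get-output <jobId>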

  6. Applications • GridPP & EDG (e.g.): ATLAS data challenge – Monte Carlo event generation, tracking & reconstruction. Large FORTRAN/C++ codes, optimised for Linux (and packaged as RPMs). Runs scripted using EDG job submit tools; GUIs under development (e.g. GANGA). • Applications on the eScience Grid: E-Minerals – Monte Carlo simulations of radiation damage to crystal structures (Condor-G, home-grown shell scripts); Geodise – genetic algorithm for optimisation of satellite truss design (Java CoG plugins in Matlab); GENIE – ocean-atmosphere modelling (flocked Condor pools). Other tools in use: HPCPortal, InfoPortal, Nimrod/G, SunDCG. Mark Hayes GridPP8 – Bristol – 23rd September 2003
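The E-Minerals entry above hints at how lightweight some of these set-ups were: home-grown shell scripts writing Condor-G submit files. A sketch of the kind of submit description such a script would generate (the gatekeeper address and file names are assumptions; the keywords are standard Condor-G of that era):

    # mc.submit -- Condor-G description routing a job to a remote GT2 gatekeeper
    universe        = globus
    globusscheduler = grid-node.example.ac.uk/jobmanager-pbs
    executable      = run_damage_mc.sh
    output          = mc.out
    error           = mc.err
    log             = mc.log
    queue
    # submitted with: condor_submit mc.submit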

  7. Technical… GridPP8 – Bristol – 23rd September 2003

  8. Comparison [diagram] Side-by-side stacks: the UK e-Science L2 Grid (applications, value-added components, simple user management (???), UK MDS, GT2, eScience CA) against GridPP/EDG 2.0 (PP applications, value-added components, EDG information system, VOMS, Resource Broker, data management – Linux 7.3 only in places – on GT2), with network monitoring common to both. Peter Clarke GridPP8 – Bristol – 23rd September 2003

  9. UK Grid : Deployment Phases • Level 0: Resources with Globus GT2 registering with the UK MDS at ginfo.grid-support.ac.uk; • Level 1: Resources capable of running the Grid Integration Test Script; • Level 2: Resources with one or more application packages pre-installed and capable of offering a service with local accounting and tools for simple user management, discovery of applications and description of resources in addition to MDS; • Level 3: GT2 production platform with widely accessible application base, distributed user and resource management, auditing and accounting. Resources signing up to Level 3 will be monitored to establish their availability and service level offered. • 7 Centres of Excellence • Globus GT3 testbed (later talks and mini workshop) • JISC JCSR resources • Level 4: TBD, probably OGSA based. Rob Allan GridPP8 – Bristol – 23rd September 2003
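Level 0 is directly checkable: MDS is an LDAP service, so a plain LDAP query against the national index lists what each site publishes. A sketch (the port follows the GT2 MDS convention; the mds-vo-name shown is an assumption):

    # Anonymous LDAP query of the UK MDS index
    ldapsearch -x -H ldap://ginfo.grid-support.ac.uk:2135 \
        -b "mds-vo-name=ukgrid, o=grid" "(objectclass=*)"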

  10. UK Grid : Key Components • ETF Coordination: activities are coordinated through regular Access Grid meetings, e-mail and the Web site; • Resources: the components of this Grid are the computing and data resources contributed by the UK e-Science Centres linked through the SuperJanet4 backbone to regional networks; • Middleware: many of the infrastructure services available on this Grid are provided by Globus GT2 software; • Directory Services: a national Grid directory service using MDS links the information servers operated at each site and enables tasks to call on resources at any of the e-Science Centres; • Security and User Authentication: the Grid operates a security infrastructure based on x.509 certificates issued by the e-Science Certificate Authority at the UK Grid Support Centre at CCLRC; • Access Grid: on-line meeting facilities with dedicated rooms and multicast network access. Rob Allan GridPP8 – Bristol – 23rd September 2003
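The certificate lifecycle behind that security infrastructure used the stock GT2 tools; a brief sketch:

    # Generate a key pair and a certificate request to send to the e-Science CA
    grid-cert-request

    # Once the signed certificate is installed, create a session proxy and inspect it
    grid-proxy-init -hours 12
    grid-proxy-info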

  11. UK Grid : GT3 Testbeds • 2 testbeds funded: • Edinburgh/ Newcastle/ UCL/ Imperial • OGSA-DAI • E-Materials e-Science pilot demonstrator • AstroGrid application demonstrator • Portsmouth/ Daresbury/ Westminster/ Manchester/ Reading • Tackle issues of inter-working between OGSI implementations • Report on deployment and ease of use • HPCPortal services demonstrator • CCP application demonstrator • E-HTPX e-Science pilot demonstrator • InfoPortal demonstrator using OGSA-DAI Rob Allan GridPP8 – Bristol – 23rd September 2003
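For orientation, OGSA-DAI exposes databases as Grid services driven by XML "perform" documents. The sketch below shows the general shape only; element and namespace names varied across early releases, so treat all of them as assumptions rather than a definitive interface:

    <?xml version="1.0"?>
    <!-- Hypothetical OGSA-DAI perform document: run one SQL query and
         expose the result as a named output stream -->
    <gridDataServicePerform xmlns="http://ogsadai.org.uk/gds/perform">
      <sqlQueryStatement name="statement">
        <expression>SELECT * FROM runs WHERE site = 'UCL'</expression>
        <resultStream name="output"/>
      </sqlQueryStatement>
    </gridDataServicePerform>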

  12. UK Grid : Issues to be Tackled A number of areas significant for a production Grid environment have hardly been tackled using GT2. Issues include: • Grid information systems, service registration, discovery and definition of facilities. Schema important; • Security, in particular role-based authorisation and security of middleware; • Portable parallel job specifications (see the sketch below); • Meta-scheduling, resource reservation and ‘on demand’ computing; • Linking and interacting with remote data sources; • Wide-area visualisation and computational steering; • Workflow composition and optimisation; • Distributed user, s/w and application management; • Data management and replication services; • Grid programming environments, PSEs and user interfaces; • Auditing, advertising and billing in a Grid-based resource market; • Semantic and autonomic tools; etc. Rob Allan GridPP8 – Bristol – 23rd September 2003
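The "portable parallel job specifications" item can be made concrete: GT2 RSL can request an MPI job, but how the request is honoured depends entirely on the local jobmanager, which is exactly the portability gap. A sketch (the executable path and limits are illustrative):

    # RSL for a 16-way MPI job; behaviour of jobType/count is jobmanager-specific
    &(executable=/home/user/ocean_mpi)
     (jobType=mpi)
     (count=16)
     (maxWallTime=60)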

  13. Coordination & Management GridPP8 – Bristol – 23rd September 2003

  14. SR2004 e-Science ‘soft-landing’ • Key e-Science Infrastructure components: • Persistent National e-Science Research Grid • Grid Operations Centre • UK e-Science Middleware Infrastructure Repository • National e-Science Institute (cf. Newton Institute) • National Digital Curation Centre • AccessGrid Support Service • e-Science/Grid collaboratories • Legal Service • International Standards Activity Paul Jeffreys GridPP8 – Bristol – 23rd September 2003

  15. Post 2006 e-Science ‘soft-landing’ • Components foreseen: • Baseline OST ‘Core Programme’, collaborating with JISC/JCSR on long-term support issues • OST should support the Persistent Research Grid and the e-Science Institute • JISC should support the Grid Operations Centre and the AccessGrid Support Service • OST and JISC should jointly support the Repository, the Curation Centre, the e-Science Legal Service and the International Standards Activity Paul Jeffreys GridPP8 – Bristol – 23rd September 2003

  16. Intersecting UK Grid and EGEE/LCG/GridPP: Coordination and Management • GridPP • Very substantial resources (computing and infrastructure) • Built-in international connections: EGEE, LCG • Need to create a ‘production’ service • Possible points of intersection of the UK programme with GridPP/EGEE/LCG: • Resources – shared sites, shared personnel • Grid Operations and Support • Security (operational level – firewalls etc.) • Change management of the software suite • OMII • CA • Interface to Europe • Training, dissemination • Collection of requirements Paul Jeffreys GridPP8 – Bristol – 23rd September 2003

  17. Discussion GridPP8 – Bristol – 23rd September 2003

  18. Discussion • EDG not yet heterogeneous, OGSA, timelines, packaging issues • Get people working together as an investment for the future • Working parties: Security, Resource Brokering, Information Services, Workflow, Ops & Support • Avoid being too ambitious or complex • Need an over-arching strategic / roadmap view • Outcomes: • Pursue grass-roots collaboration & the strategic view in parallel • Set up a working party to recommend common elements of the roadmap • Set up a small number of technical working parties; definition and details being taken forward by Andy Parker, report in preparation. GridPP8 – Bristol – 23rd September 2003

  19. Status & convergence [diagram] Current status: applications run on two parallel GT2 stacks, the e-Science L2 Grid and the PP Grid. A “shared” engineering / “common core” layer is sketched between them, with open questions (???) over whether it is based on LCG-1/EGEE-0 (GT2) or on OGSA. GridPP8 – Bristol – 23rd September 2003

  20. Conclusions • Possible route for technical collaboration • Shared engineering – “working together” • Common objective in OGSA (similar timescales) • No overall co-ordination - yet • Much still to do to make it happen! GridPP8 – Bristol – 23rd September 2003
