
Grid Operations Support Centre and UK National Grid Service What Next ?


Presentation Transcript


  1. Grid Operations Support Centre and UK National Grid Service: What Next? Neil Geddes, GridPP15, January 2006

  2. Outline • Current status • Plans for the coming year • NGS-2 • Working with …

  3. Introduction • The “Science and Innovation Investment Framework 2004-2014” • a national e-Infrastructure (hardware, networks, communications technology) to provide “ready and efficient access to information of all kinds – such as experimental data sets, journals, theses, conference proceedings and patents”. • Critical to successful collaborative, multi-disciplinary research and innovation • ‘Over the decade many of the grand challenges in research will occupy the interfaces between the separate research disciplines developed in the 19th and 20th centuries… much more needs to be done, and by more players, if the UK is to achieve a global edge’. • The NGS and GOSC explicitly address exactly these issues • Common interfaces and operational procedures provide the basis of: • efficient sharing of resources • simple user access to an increasing range of resources • a key step towards a service economy for data and computation.

  4. The National Grid Service • Launched April 2004 • Full production September 2004 • Focus on deployment/operations • Do not do development • Responsive to users’ needs • + Belfast, Westminster …

  5. NGS Users

  6. [chart: axis labels only; content not recoverable]

  7. SRB storage history for the month prior to 31/08/05. Detailed information: https://www.ngs.ac.uk/ops/gits/srb/srbreport.txt

  8. Users by institution [chart; legible legend entries: IB, INRIA]

  9. Users by “Research Council”

  10. P-GRADE NGS Portal http://www.cpc.wmin.ac.uk/ngsportal The P-GRADE NGS portal, operated by the University of Westminster, offers an alternative to the NGS Portal for executing and monitoring computational jobs on the UK National Grid Service. In addition, it enables the graphical development, execution and visualisation of workflows – composed of sequential and parallel jobs – on the NGS resources. Workflow steps: log in • create workflow • map execution • execute workflow • visualise execution.

  11. GEMLCA - Legacy Code Support for NGS Users http://www.cpc.wmin.ac.uk/gemlca • If you have a legacy application that you would like to make accessible to other NGS users, utilise GEMLCA to: • publish (upload) your application into a central GEMLCA repository • make it available for authorised users to browse and build into workflows through the P-GRADE NGS portal

  12. • GODIVA – Diagnostics Study of Oceanography Data • EVE – Excitations and Visualisation Project • Integrative Biology – Simulation of Strain in Soft Tissue under Gravity

  13. PDB2MD: an automated pipeline performing molecular dynamics simulations on DNA crystal structures. Charlie Laughton, School of Pharmacy, University of Nottingham. Pipeline: PDB database of DNA crystal structures → AMBER running on NGS → MD database & analysis. The results reveal biologically important patterns of sequence-dependent flexibility. Example structures: 116d, 180d, 196d, 1d56.

  14. PDB2MD (continued): the same PDB → AMBER-on-NGS → MD database & analysis pipeline applied to further structures: 1ilc, 116d, 180d, 196d, 1d56.
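A PDB2MD-style pipeline can be sketched as follows. This is an illustrative assumption, not the actual PDB2MD implementation: the download URL uses the current RCSB file layout, and the AMBER input/file names (`leap_*.in`, `md.in`, `.prmtop`, `.inpcrd`) are hypothetical placeholders for what a grid job would run on an NGS node.

```python
# Sketch of a PDB2MD-style pipeline: locate DNA crystal structures by PDB id,
# then build the AMBER commands a batch job would run on an NGS compute node.
# URL pattern and file names are illustrative assumptions.

PDB_IDS = ["116d", "180d", "196d", "1d56", "1ilc"]

def fetch_url(pdb_id: str) -> str:
    """URL of the crystal structure in the PDB archive (current RCSB layout)."""
    return f"https://files.rcsb.org/download/{pdb_id.upper()}.pdb"

def amber_commands(pdb_id: str) -> list[str]:
    """Commands a grid job might run: topology build, then MD with sander."""
    return [
        f"tleap -f leap_{pdb_id}.in",                    # build topology/coordinates
        f"sander -O -i md.in -p {pdb_id}.prmtop -c {pdb_id}.inpcrd "
        f"-o {pdb_id}.out -x {pdb_id}.crd",              # run the MD simulation
    ]

# One job specification per structure; results feed the MD database & analysis.
jobs = {pid: amber_commands(pid) for pid in PDB_IDS}
```

Each entry in `jobs` corresponds to one independent simulation, which is what makes the pipeline a natural fit for farming out across NGS nodes.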

  15. RealityGrid Example

  16. The BRIDGES project – Micha Bayer, NeSC-Glasgow. BRIDGES GridBLAST job submission: the end user machine runs a GridBLAST client that sends job requests to the NESC Grid Server (Titania), where a GT3 core grid service running under Apache Tomcat passes work to the BRIDGES meta-scheduler. Jobs are farmed out to compute nodes through three wrappers: a Condor wrapper to the Condor Central Manager of the NESC Condor pool and its execution hosts; a GT2.4 wrapper to the Leeds and Oxford headnodes on the NGS (GT2.4 + BLAST); and a PBS wrapper to the ScotGRID masternode (PBS server side + BLAST) and ScotGRID worker nodes. Results are returned to the client.
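The Condor leg of the diagram above can be sketched in code. This is a hedged illustration of how a GridBLAST-style service might farm BLAST queries out as Condor jobs, not the BRIDGES implementation: the submit-description keywords are standard Condor, but the `blastall` invocation and file layout are assumptions.

```python
# Sketch: generate a Condor submit description that queues one BLAST job per
# query file. The resulting text would be handed to `condor_submit` on the
# pool's central manager; executable name and arguments are illustrative.

def blast_submit_description(query_files: list[str], db: str = "nr") -> str:
    """Build a Condor submit description with one queued job per query file."""
    lines = [
        "universe   = vanilla",
        "executable = blastall",          # NCBI BLAST binary assumed on the pool
        "should_transfer_files   = YES",
        "when_to_transfer_output = ON_EXIT",
    ]
    for i, query in enumerate(query_files):
        lines += [
            f"arguments = -p blastp -d {db} -i {query} -o result_{i}.txt",
            f"transfer_input_files = {query}",
            "queue",                      # one process in the cluster per query
        ]
    return "\n".join(lines)

description = blast_submit_description(["q0.fasta", "q1.fasta"])
```

Because each query is an independent process, the central manager can match them to whatever execution hosts in the pool are free, which is the point of the meta-scheduling layer in the slide.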

  17. Roadmap • Goals: • Support the current service functionality and user base • Expansion to include new partners • Improve interoperability with EGEE, TeraGrid and DEISA • Convergence with project/community/campus infrastructure • Provision of value-added services on top of the basic NGS infrastructure • Specific targets (services, operations, technology): • Improved operational security and incident response procedures • Deployment of a resource broker ? • Virtual Organisation Management Service (VOMS) • Incremental support for the LCG Baseline Services ☺ • A co-scheduling system ? • Job submission to the NGS through the emerging JSDL standard • Support for access to a range of data storage services ☺ • Resource accounting and improved grid account management • Provision of a test system for dynamic service hosting • Services from the OMII managed programme ☺ • Initial integration with Shibboleth authentication ? • First examples of “authorisation aware” generic services ?
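The "job submission through the emerging JSDL standard" target refers to the GGF's XML job description language. A minimal JSDL document can be sketched like this; the namespaces are those of the JSDL 1.0 draft (2005/11), while the executable and argument are examples only.

```python
# Sketch: build a minimal JSDL job description of the kind the roadmap's
# JSDL target refers to. Namespaces follow the GGF JSDL 1.0 (2005/11) schema;
# the job content itself is an example.
import xml.etree.ElementTree as ET

JSDL = "http://schemas.ggf.org/jsdl/2005/11/jsdl"
POSIX = "http://schemas.ggf.org/jsdl/2005/11/jsdl-posix"

def make_jsdl(executable: str, *args: str) -> bytes:
    """Serialise a JobDefinition wrapping a POSIXApplication."""
    job = ET.Element(f"{{{JSDL}}}JobDefinition")
    desc = ET.SubElement(job, f"{{{JSDL}}}JobDescription")
    app = ET.SubElement(desc, f"{{{JSDL}}}Application")
    posix = ET.SubElement(app, f"{{{POSIX}}}POSIXApplication")
    ET.SubElement(posix, f"{{{POSIX}}}Executable").text = executable
    for a in args:
        ET.SubElement(posix, f"{{{POSIX}}}Argument").text = a
    return ET.tostring(job)

doc = make_jsdl("/bin/echo", "hello")
```

A standard document like this is what lets the same job description be handed to NGS, EGEE or TeraGrid submission services, which is why JSDL matters for the interoperability goals above.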

  18. What Next?

  19. UK e-Infrastructure: users get common access, tools, information and nationally supported services through the NGS, integrated internationally. Components: regional and campus grids • community grids • VRE, VLE, IE • HPCx + HECToR • LHC • ISIS TS2.

  20. The Vision of the National Grid Service • Integrated, coherent electronic access for UK researchers to all resources and facilities that they require to carry out their research • From advanced real-time facilities and instruments to historical data • Supporting access to regional, national and international facilities • Integrating institutional or even departmental resources • Examples of the services: • Location-independent data storage and replication • Location-independent access to institutional repositories and certified long-term archival • Location-independent access to local, regional, national and international data resources • Access to local, regional, national and international computational resources • Internationally recognised electronic identity and identity management tools • Tools for managing collaborative project or Virtual Organisation based authentication and authorisation • Co-scheduling and operation of a wide range of national and international resources • Tools to support distributed collaborative working • In addition the GOSC will support: • 24-hour monitoring of the UK’s grid infrastructure • The policy framework for operations • Review of partner services • A central UK help desk and support centre • A repository of user and system documentation

  21. Next 3 years • The GOSC and NGS development will lead to: • An expanded National Grid Service for UK research • more partners and affiliates • a wider range of services provided • Robust NGS operations and management procedures • Interoperability with other national and international e-infrastructures • Integration of key data sources and data services, including: • data centres – EDINA, MIMAS, AHDS • facilities – LHC, DIAMOND and ISIS • Improved and measured NGS reliability • Supported scientific research over the next 3-5 years • Value-added services deployed on the basic NGS infrastructure • Detailed service definitions for a sustainable infrastructure

  22. In more detail…

  23. Support Centre • Front-line e-infrastructure support for the UK academic community • Authentication framework required by national and international agreement • Infrastructure services • authentication, VO/authorisation, information, monitoring, resource brokering • Support interfaces with partner infrastructures • within the UK: e.g. MIMAS, EDINA, AHDS, GridPP, e-minerals, and others • internationally: e.g. TeraGrid, EGEE • Website, including documentation, training materials and related links • Work with NeSC and other organisations to provide training • Coordinate the development and deployment programme for the NGS • Development expertise • User management for the NGS • Partner and affiliate management for the NGS

  24. National and International Facilities • UK gateway to: • Regional, national and international HPC facilities • HECToR, TeraGrid and DEISA • Advanced national experimental facilities • DIAMOND, ISIS and the National Crystallographic Service • National, regional and institutional data centres • Campus grid developments • Grid infrastructures • Grid Interoperability workshop at SC05 • EGEE, TeraGrid, OSG, NGS, NAREGI, APAC • First of a regular series – next meeting at GGF in March • Some key proto-agreements • JSDL, SRM, GridFTP, GSI and VOMS, “GLUE” (and CIM), RU records • GOSC will: • Integrate access to data from experimental facilities • Support collaborative activities spanning each (or all) of these infrastructures • Where possible, deploy a robust cross-service scheduling system • Common tools and infrastructures for data handling, organisation, manipulation and creation of user interfaces

  25. Grid Interoperation • Leaders from nine Grid initiatives – TeraGrid, OSG, EGEE, APAC, NAREGI, DEISA, PRAGMA, UK NGS and KISTI – met at SC05 to plan an application-driven “Interop Challenge” in 2006. • Six international teams will meet for the first time at GGF-16 in February 2006: • Application Use Cases (Bair/TeraGrid, Alessandrini/DEISA) • Authentication/Identity Mgmt (Skow/TeraGrid) • Job Description Language (Newhouse/UK-NGS) • Data Location/Movement (Pordes/OSG) • Information Schemas (Matsuoka/NAREGI) • Testbeds (Arzberger/PRAGMA)

  26. Integration of Computational and Data Resources • Work with major and representative data providers • Provide a common gateway as the basis for continued developments • Progressively more of the data providers should be incorporated • To develop the necessary integration services, the GOSC will work with research communities that require data integration, for example environmental, biomedical and socio-economic research • The NGS will support high-level tools, driven by metadata and abstractions that suit the research disciplines, working in conjunction with OMII-UK and international tool developers to promote and understand these developments • The MIMAS and EDINA data centres are already, or will soon be, funded explicitly to “grid enable” some of their resources. GOSC will work closely with these data centres to ensure that these developments are compatible with and integrated into the NGS services

  27. Core NGS nodes • Core NGS nodes: • Provide a controlled test environment for new services • Provide a high level of professionally managed central services • Facilitate the rapid technical development of the NGS infrastructure • Provide a common infrastructure and proving ground for new users • Drive a technical agenda, developing experience and best practice • Represent a neutral core around which partners can converge • Provide resources to guarantee access for new grid users • Positioned between HPC and campus grids • Upgrade existing core nodes in 2007

  28. Accounting and Charging • Accounting and metering are already important for the NGS • Monitoring of the infrastructure is important for operations • Monitoring of service availability/performance for partnership • Monitoring and accounting of service/resource usage • Effective distributed accounting/metering provides the basis by which partners can have confidence in resource sharing and in additional resource provision and consumption • This work will continue and the infrastructure to support it will continue to be developed • NGS must, at the very least, provide a mechanism for partners to “trade” • FEC is currently a complication: • Partners must recover their investment (loan) • Users have no “money” • NGS sustainability
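The kind of aggregation a distributed accounting service needs before partners can "trade" can be sketched simply: roll per-job usage records up into per-partner balances of CPU-hours consumed versus provided. The record fields below are illustrative assumptions, loosely modelled on usage-record attributes, not an NGS schema.

```python
# Sketch: aggregate per-job usage records into per-partner CPU-hour balances,
# the basis on which partners could "trade" resource provision/consumption.
# Record fields (consumer, provider, wall_seconds, processors) are assumptions.
from collections import defaultdict

def partner_balances(records: list[dict]) -> dict[str, float]:
    """CPU-hours each partner consumed minus CPU-hours it provided."""
    balance: dict[str, float] = defaultdict(float)
    for rec in records:
        hours = rec["wall_seconds"] * rec.get("processors", 1) / 3600
        balance[rec["consumer"]] += hours   # site whose user ran the job
        balance[rec["provider"]] -= hours   # site that supplied the CPU
    return dict(balance)

records = [
    {"consumer": "Leeds", "provider": "Oxford", "wall_seconds": 7200},
    {"consumer": "Oxford", "provider": "Leeds", "wall_seconds": 3600,
     "processors": 4},
]
balances = partner_balances(records)
# Leeds: +2h consumed at Oxford, -4h provided to Oxford => -2.0
```

A net-positive balance marks a partner consuming more than it provides; under FEC, that delta is what would need a recovery mechanism even though users themselves hold no "money".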

  29. Integration with Computing Service Providers • Factors for closer integration with computing service providers are: • Well-defined interfaces and procedures • Clear understanding of the roles and commitments • A well-understood core software stack with components of known provenance • Effective metering and auditing of services and users • Integration of the NGS AAA framework with the standard national and campus systems • Training, documentation and awareness-raising targeted at computing services • The role of UKERNA is key here: UKERNA has extensive experience in dealing with the service providers in question and excellent connections into these institutions • GOSC will work with UKERNA and other key groups such as UCISA and RUGIT • As the number and integration of partners grows, the NGS will need to integrate better with the UCISA community

  30. Outreach and Training • The UK needs training as: • Support for decision makers • Support for e‑Infrastructure providers • Support for users • Support for application, portal and tool developers • Support for educators • Essential to coordinate and leverage the work of other bodies • JISC, RIN, NeSC, EGEE …

  31. OMII • The initial OMII releases were of limited value to the NGS • experimental deployments of OMII software have been installed on the NGS since November 2004 • OMII 2.0 has more interesting components • OGSA-DAI WS-I, GridSAM • OMII-UK brings in more • OGSA-DAI and myGrid • The OMII-UK goal is to supply interoperable Web Services (Linux and Microsoft platforms) • GOSC fully supports these goals • NGS will: • Continue its deployment of OGSA-DAI on the data nodes • Provide access to the NGS compute resources through GridSAM • Deploy the Resource Usage Service • In addition GOSC and OMII will work together to: • Provide services that can be integrated into workflows controlled through (OMII-UK) tools • Improve management of web services • Improve accounting for grid service use • Integrate Shibboleth and other VO tools used by GOSC • Interoperability of OMII software with existing and likely future NGS infrastructure remains a key requirement

  32. Next 2-3 Years – Provisional Budget • Notes: • PI time (20%), Director (100%), Technical Director (50%), technical administration (100%) and secretarial support (100%) • All GOSC roles except the explicit support for core NGS nodes, data service integration, community gateways and management • A full cost assuming 4 nodes @ £600k each plus 3 years’ system administration effort for each • Data centre integration expertise, matching the computational expertise from the core NGS nodes • Funding for “application” or “community” gateway activities: it is essential that some form of activity like this be supported in some way • UKERNA work on networking, security and operational best practice is not explicitly included above

  33. Relation to GridPP • NGS and GridPP are partners in EGEE • Joining the NGS brings generic user and admin support • “GridPP” is not a partner in the NGS, but individual GridPP sites are expected to be partners or affiliates • Build and support broader communities locally • LCG(UK) eventually a logical view of the UK grid • Partners can offer a wide range of services
