The impact of grid computing on UK research

Presentation Transcript


  1. The impact of grid computing on UK research R Perrott Queen’s University Belfast

  2. The Grid: The Web on Steroids. The Web provides uniform access to HTML documents; the Grid provides flexible, high-performance access to all significant resources (software catalogs, computers, sensor nets, colleagues, data archives) and supports on-demand creation of powerful virtual computing systems.

  3. Why Now? • The Internet as infrastructure • Increasing bandwidth, advanced services • Advances in storage capacity • Terabyte for < $15,000 • Increased availability of compute resources • Clusters, supercomputers, etc. • Advances in application concepts • Simulation-based design, advanced scientific instruments, collaborative engineering, ...

  4. Grids • computational grid • provides the raw computing power, high speed bandwidth interconnection and associate data storage • information grid • allows easily accessible connections to major sources of information and tools for its analysis and visualisation • knowledge grid • gives added value to the information and also provides intelligent guidance for decision-makers

  5. Grid Architecture: from data to knowledge. Layered view: Knowledge Grid on top of Information Grid on top of Computation & Data Grid, with Communications and Control at the base.

  6. Application Layer: a program suite that can search out data and services to satisfy job requirements; user-level API and libraries. Middleware: schedulers to launch the work in the right places, and the authentication, authorisation, etc. to allow it to happen. Base Layer: local resources that form local contributions to the federated resource. (Stakeholders shown alongside the layers: application users and software suppliers.)
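
The layering above can be made concrete in code. Below is a minimal sketch (not from the talk) of the split: application-layer code that only sees a middleware API, middleware that handles authorisation and scheduling, and base-layer local resources. All class, site and user names are invented for illustration and stand in for real middleware such as Globus.

```python
# Minimal sketch of the three-layer split; all names are hypothetical.
from dataclasses import dataclass


@dataclass
class LocalResource:
    """Base layer: a site's local contribution to the federated resource."""
    name: str
    free_cpus: int


class Middleware:
    """Middleware layer: authorisation plus a deliberately naive scheduler."""

    def __init__(self, resources, authorised_users):
        self.resources = resources
        self.authorised_users = set(authorised_users)

    def submit(self, user, cpus_needed):
        if user not in self.authorised_users:        # authentication/authorisation
            raise PermissionError(f"{user} is not authorised on this grid")
        for site in self.resources:                  # scheduling: first site that fits
            if site.free_cpus >= cpus_needed:
                site.free_cpus -= cpus_needed
                return f"job placed at {site.name}"
        raise RuntimeError("no site can satisfy the request")


def run_application(grid):
    """Application layer: user-level code that only sees the middleware API."""
    return grid.submit(user="alice", cpus_needed=32)


if __name__ == "__main__":
    grid = Middleware(
        resources=[LocalResource("Belfast", 16), LocalResource("Leeds", 64)],
        authorised_users=["alice"],
    )
    print(run_application(grid))   # -> job placed at Leeds
```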

  7. UK Research Councils — approx. funding for 2000/01 (£M): • Biotechnology and Biological Sciences Research Council (BBSRC): 200 • Engineering and Physical Sciences Research Council (EPSRC): 400 • Economic and Social Research Council (ESRC): 70 • Medical Research Council (MRC): 350 • Natural Environment Research Council (NERC): 225 • Particle Physics and Astronomy Research Council (PPARC): 200 • Council for the Central Laboratory of the Research Councils: 100

  8. UK Grid Development Plan (Grid Core Programme) • Network of e-Science Centres • Development of Generic Grid Middleware • Grid Grand Challenge Project • Support for e-Science Projects • International Involvement • Grid Network Team

  9. 1. Grid Core Programme Centres • National e-Science Centre to achieve international visibility • National Centre will host international e-Science seminars ‘similar’ to Newton Institute • Funding 8 Regional e-Science Centres to form coherent UK Grid • DTI funding requires matching industrial involvement • Good overlap with Particle Physics and AstroGrid Centres

  10. [Map of e-Science Centres: Edinburgh, Glasgow, Newcastle, Belfast, DL, Manchester, Cambridge, Oxford, RL, Hinxton, Cardiff, London, Soton]

  11. Access Grid • Centres will be Access Grid Nodes • Access Grid will enable informal and formal group-to-group collaboration • It enables: distributed lectures and seminars, virtual meetings, complex distributed grid demos • Will improve the user experience (“sense of presence”) - natural interactions (natural audio, big display)

  12. 2. Generic Grid Middleware • Continuing dialogue with major industrial players - IBM, Microsoft, Oracle, Sun, HP .. - IBM Press Announcement August 2001 • Open Call for Proposals from July 2001 plus Centre industrial projects • Funding Computer Science involvement in EU DataGrid Middleware Work Packages

  13. 3. Grid Interdisciplinary Research Centres Project • 4 IT-centric IRCs funded - DIRC : Dependability - EQUATOR : HCI - AKT : Knowledge Management - Medical Informatics • ‘Grand Challenge’ in Medical/Healthcare Informatics - issues of security, privacy and trust

  14. 4. Support for e-Science Projects • ‘Grid Starter Kit’ Version 1.0 available for distribution from July 2001 • Set up Grid Support Centre • Training Courses • National e-Science Centre Research Seminar Programme

  15. 5. International Involvement • ‘GridNet’ at National Centre for UK participation in the Global Grid Forum • Funding CERN and iVDGL ‘Grid Fellowships’ • Participation/Leadership in EU Grid Activities - New FP5 Grid Projects (DataTag, GRIP, …) • Establishing links with major US Centres – San Diego Supercomputer Center, NCSA

  16. 6. Grid Network Team • Tasked with ensuring adequate end-to-end bandwidth for e-Science Projects • Identify/fix network bottlenecks • Identify network requirements of e-Science projects • Funding traffic engineering project • Upgrade SuperJANET4 connection to sites

  17. Network Issues • Upgrading SJ4 backbone from 2.5 Gbps to 10 Gbps • Installing 2.5 Gbps link to GEANT pan-European network • TransAtlantic bandwidth procurement • 2.5 Gbps dedicated fibre • Connections to Abilene and ESNet • EU DataTAG project 2.5 Gbps link from CERN to Chicago
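
As a rough illustration of what these bandwidth figures mean in practice, here is a short back-of-envelope calculation (not from the talk) of how long it takes to move 1 TB of experimental data at the quoted link speeds, ignoring protocol overhead and contention.

```python
# Illustrative arithmetic only: time to move 1 TB at the quoted link speeds.

def transfer_hours(terabytes: float, gbps: float) -> float:
    bits = terabytes * 8e12            # 1 TB = 8 x 10^12 bits (decimal units)
    return bits / (gbps * 1e9) / 3600  # seconds -> hours

for link, speed in [("SJ4 backbone, before upgrade", 2.5),
                    ("SJ4 backbone, after upgrade", 10.0),
                    ("GEANT / transatlantic links", 2.5)]:
    print(f"{link:30s} {speed:4.1f} Gbps -> {transfer_hours(1, speed):.2f} h per TB")
```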

  18. Early e-Science Demonstrators • Funded: Dynamic Brain Atlas, Biodiversity, Chemical Structures • Under Development/Consideration: Grid-Microscopy, Robotic Astronomy, Collaborative Visualisation, Mouse Genes, 3D Engineering Prototypes, Medical Imaging/VR

  19. Particle Physics and Astronomy Research Council (PPARC) • GridPP (http://www.gridpp.ac.uk/) • to develop the Grid technologies required to meet the LHC computing challenge • collaboration with international grid developments in Europe and the US

  20. Particle Physics and Astronomy Research Council (PPARC) • ASTROGRID (http://www.astrogrid.ac.uk/) • a ~£4M project aimed at building a data-grid for UK astronomy, which will form the UK contribution to a global Virtual Observatory

  21. EPSRC Testbeds (1) • DAME : Distributed Aircraft Maintenance Environment • RealityGrid : closely couple high performance computing, high throughput experiment and visualization • GEODISE : Grid Enabled Optimisation and DesIgn Search for Engineering

  22. EPSRC Testbeds (2) • CombiChem : combinatorial chemistry structure-property mapping • MyGrid : personalised extensible environments for data-intensive experiments in biology • Discovery Net : high throughput sensing

  23. Distributed Aircraft Maintenance Environment Jim Austin, University of York Peter Dew, Leeds Graham Hesketh, Rolls-Royce

  24. [DAME data-flow diagram: in-flight data → global network → ground station; airline, DS&S Engine Health Center, maintenance centre and data centre linked by Internet, e-mail and pager]

  25. Aims • To build a generic grid test bed for distributed diagnostics on a global scale • To demonstrate this on distributed aircraft maintenance • To evaluate the effectiveness of the grid for this task • To deliver grid-enabled technologies that underpin the application • To investigate performance issues
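
To make the "distributed diagnostics on a global scale" aim concrete, here is a toy sketch: an in-flight engine signature is matched against historical signatures held in several distributed archives. The archive names, fault labels, feature vectors and the simple distance measure are all invented for illustration; the real DAME test bed uses grid services and pattern-matching engines, not this code.

```python
# Toy illustration only: invented archives and a trivial similarity measure.
from math import dist

# Each data centre holds historical engine signatures (short feature vectors here).
ARCHIVES = {
    "site-A": {"bearing-wear": (0.9, 0.1, 0.3)},
    "site-B": {"blade-damage": (0.2, 0.8, 0.5)},
    "site-C": {"normal":       (0.1, 0.1, 0.1)},
}

def diagnose(in_flight_features):
    """Query every archive and report the closest historical match."""
    site, label, distance = min(
        ((site, label, dist(in_flight_features, signature))
         for site, signatures in ARCHIVES.items()
         for label, signature in signatures.items()),
        key=lambda hit: hit[2],
    )
    return f"closest match: {label} (archive {site}, distance {distance:.2f})"

print(diagnose((0.85, 0.15, 0.25)))   # -> closest match: bearing-wear ...
```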

  26. [Computational infrastructure diagram: White Rose Computational Grid (SAN) running across SuperJANET and YHMAN, linking Leeds (local grid, lab machines, Onyx 3, Teradata, shared-memory cluster, 3D interactive graphics & conferencing), Sheffield (distributed memory) and York (shared memory)]

  27. MyGrid: personalised extensible environments for data-intensive experiments in biology. Professor Carole Goble, University of Manchester; Dr Alan Robinson, EBI

  28. Consortium • Scientific Team • Biologists • GSK, AZ, Merck KGaA, Manchester, EBI • Technical Team • Manchester, Southampton, Newcastle, Sheffield, EBI, Nottingham • IBM, SUN • GeneticXchange • Network Inference, Epistemics Ltd

  29. Comparative Functional Genomics • Vast amounts of data & escalating • Highly heterogeneous • Data types • Data forms • Community • Highly complex and inter-related • Volatile

  30. MyGrid e-Science Objectives Revolutionise scientific practice in biology • Straightforward discovery, interoperation, sharing • Improving quality of both experiments and data • Individual creativity & collaborative working • Enabling genomic level bioinformatics Cottage Industry to an Industrial Scale

  31. On the shoulders of giants We are not starting from scratch… • Globus Starter Kit … • Web Service initiatives … • Our own environments … • Integration platforms for bioinformatics … • Standards e.g. OMG LSR, I3C … • Experience with Open Source

  32. Specific Outcomes • E-Scientists • Environment built on toolkits for service access, personalisation & community • Gene function expression analysis • Annotation workbench for the PRINTS pattern database • Developers • MyGrid-in-a-Box developers kit • Re-purposing existing integration platforms
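
As a rough illustration of the kind of service-chaining workflow these outcomes point at, a minimal sketch follows. The gene identifier, sequence, motifs and all three "services" below are invented stand-ins, not real MyGrid, EBI or Manchester services.

```python
# Toy illustration only: invented data and stand-in "services".

def fetch_sequence(gene_id: str) -> str:
    """Stand-in for a remote sequence-retrieval service."""
    return {"gene-X": "ATGCCTATTGGATCCAAAGAGAGG"}.get(gene_id, "")

def find_motifs(sequence: str) -> list:
    """Stand-in for a remote analysis service: naive motif search."""
    return [m for m in ("GGATCC", "GAGAGG") if m in sequence]

def annotate(gene_id: str, motifs: list) -> dict:
    """Stand-in for an annotation step: bundle results for the e-Scientist."""
    return {"gene": gene_id, "motifs": motifs, "motif_count": len(motifs)}

def workflow(gene_id: str) -> dict:
    """Chain the services: retrieval -> analysis -> annotation."""
    return annotate(gene_id, find_motifs(fetch_sequence(gene_id)))

print(workflow("gene-X"))
# {'gene': 'gene-X', 'motifs': ['GGATCC', 'GAGAGG'], 'motif_count': 2}
```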

  33. Discovery Net • Yike Guo, John Darlington (Dept. of Computing) • John Hassard (Depts. of Physics and Bioengineering) • Bob Spence (Dept. of Electrical Engineering) • Tony Cass (Department of Biochemistry) • Sevket Durucan (T. H. Huxley School of Environment) • Imperial College London

  34. AIM • To design, develop and implement an infrastructure to support real-time processing, interaction, integration, visualisation and mining of massive amounts of time-critical data generated by high throughput devices.

  35. The Consortium — industry connection: 4 spin-off companies + related companies (AstraZeneca, Pfizer, GSK, Cisco, IBM, HP, Fujitsu, Gene Logic, Applera, Evotec, International Power, Hydro Quebec, BP, British Energy, ….)

  36. Industrial Contribution • Hardware: sensors (photodiode arrays), systems (optics, mechanical systems, DSPs, FPGAs) • Software: analysis packages, algorithms, data warehousing and mining systems • Intellectual Property: access to IP portfolio suite at no cost • Data: raw and processed data from biotechnology, pharmacogenomic, remote sensing (GUSTO installations, satellite data from geo-hazard programmes) and renewable energy data (from remote tidal power systems)

  37. High Throughput Sensing Characteristics [diagram: distributed devices, distributed reference DBs, distributed warehousing, distributed users, collaborative applications] • Different devices but the same computational characteristics • Data intensive & data dispersive: large-scale, heterogeneous, distributed data • Real-time data manipulation • Need to calibrate, integrate and analyse • GRID issues, data issues, information issues and discovery issues at each stage
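
The "calibrate, integrate, analyse" chain for streamed sensor readings can be sketched as a tiny pipeline. The device names, readings, calibration offsets and the per-device mean below are invented placeholders, not Discovery Net code; they only show the shape of the chain.

```python
# Toy illustration only: calibrate -> integrate -> analyse on made-up readings.
from statistics import mean

RAW_STREAM = [
    {"device": "sensor-1", "reading": 412}, {"device": "sensor-2", "reading": 397},
    {"device": "sensor-1", "reading": 388}, {"device": "sensor-2", "reading": 420},
]

def calibrate(sample):
    """Apply a per-device offset (made-up values)."""
    offsets = {"sensor-1": -10, "sensor-2": +5}
    return {**sample, "reading": sample["reading"] + offsets[sample["device"]]}

def integrate(stream):
    """Group calibrated readings by device so stations can be correlated."""
    grouped = {}
    for sample in stream:
        grouped.setdefault(sample["device"], []).append(sample["reading"])
    return grouped

def analyse(grouped):
    """Summarise each device's readings (a stand-in for real mining/visualisation)."""
    return {device: mean(values) for device, values in grouped.items()}

print(analyse(integrate(calibrate(s) for s in RAW_STREAM)))
# {'sensor-1': 390, 'sensor-2': 413.5}
```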

  38. Testbed Applications and HTS Applications (throughput in GB/s, size in petabytes, node number, operations):
  • Renewable energy applications (tidal energy; connections to other renewable initiatives: solar, biomass, fuel cells; and to CHP and baseload stations) and remote sensing applications (air sensing, GUSTO; geological, geohazard analysis) — large-scale dynamic real-time decision support; throughput 1-10 GB/s; size 1-10 petabytes; >20000 nodes; operations: structuring, mining, optimisation, RT decisions
  • Bio chip applications (protein-folding chips: SNP chips, diff. gene chips using LFII; protein-based fluorescent micro arrays) — large-scale dynamic system knowledge discovery; throughput 1-100 GB/s; size 10-100 petabytes; >50000 nodes; operations: image registration, visualisation, predictive modelling, RT decisions
  • Distributed dynamic knowledge management — throughput 1-1000 GB/s; size 10-1000 petabytes; >10000 nodes; operations: data quality, visualisation, structuring, clustering

  39. Large-scale urban air sensing applications (GUSTO). Each GUSTO air pollution system produces 1 kbit per second, or 10^10 bits per year. We expect to increase the number (from the present 2 systems) to over 20,000 over the next 3 years, to reach a total of 0.6 petabytes of data within the 3-year ramp-up. The useful information comes from time-resolved correlations among remote stations, and with other environmental data sets. [Figure: NO simulant measurement, 6.7.2001]

  40. The IC Advantage — the IC infrastructure: a microgrid for the testbed. [Diagram: end devices, floor switches, building router switches, core router switches, central computing facilities (workstation cluster, SMP, storage, wireless), proposed firewall, London MAN/JANET] Network: over 12,000 end devices; 10 Mb/s – 1 Gb/s to end devices; 1 Gb/s between floors; 10 Gb/s to backbone; 10 Gb/s between backbone router matrix and wireless capability; £3m SRIF funding network upgrade; 2x1 Gb/s to LMAN II (10 Gb/s scheduled 2004); access to disparate off-campus sites: IC hospitals, Wye College etc. ICPC Resource: 150 Gflops processing; >100 GB memory; 5 TB of disk storage; +20 TB of disk storage; +25 TB of tape storage; 3 clusters (>1 teraflops)

  41. Conclusions • Good ‘buy-in’ from scientists and engineers • Considerable industrial interest • Reasonable ‘buy-in’ from a good fraction of the Computer Science community, but not all • Serious interest in Grids from IBM, HP, Oracle and Sun • On paper the UK now has the most visible and focussed e-Science/Grid programme in Europe • Now have to deliver!

  42. US Grid Projects/Proposals • NASA Information Power Grid • DOE Science Grid • NSF National Virtual Observatory • NSF GriPhyN • DOE Particle Physics Data Grid • NSF Distributed Terascale Facility • DOE ASCI Grid • DOE Earth Systems Grid • DARPA CoABS Grid • NEESGrid • NSF BIRN • NSF iVDGL

  43. EU Grid Projects • DataGrid (CERN, ..) • EuroGrid (Unicore) • DataTag (TTT…) • Astrophysical Virtual Observatory • GRIP (Globus/Unicore) • GRIA (Industrial applications) • GridLab (Cactus Toolkit) • CrossGrid (Infrastructure Components) • EGSO (Solar Physics)
