UK e-Science National e-Science Centre Open Day. Prof. Malcolm Atkinson, Director, www.nesc.ac.uk. 17th January 2003.
Presentation Transcript


  1. UK e-Science National e-Science Centre Open Day Prof. Malcolm Atkinson Director www.nesc.ac.uk 17th January 2003

  2. e-Science Leadership
  • Partnerships
    • e-Science alliance: Edinburgh + Glasgow
      • Physics & Astronomy (2), EPCC, Informatics, Computing Science
    • Capability Computing & e-Science: Edinburgh + CCLRC
  • UK + EU: Research and Training Projects £70M
    • GridPP, European Data Grid, AstroGrid, ENACTS, GRIDSTART, RealityGrid, Neuroinformatics Grid, …
    • QCDOC + QCD Grid
    • HPC(x) (Edinburgh, IBM, CCLRC: 3.3 TFlops)
  • Scottish Investment £6.7M
    • ScotGRID, SRIF, eDIKT, Scottish Centre for Genomic Technology and Informatics, …
  • NeSC set up, launched and running £8M
    • e-Science Institute
    • Blue Gene Workshop (Protein Folding & Structure, IBM)
    • GGF5 & HPDC11 (900 people, largest GGF, largest HPDC)
    • BlueDwarf (IBM p690 server donated for Scientific DB Research)

  3. What is e-Science?

  4. UK e-Science (from a presentation by Tony Hey)

  5. What's Changing
  • Growing interdependence of theory, computing and experiment
  • Collaboration is growing
  • Data is exploding

  6. UK Investment

  7. UK e-Science Programme (1), 2001–2003
  • Total £120M over 3 years, plus > £56M for HPCx over 6 years
  • Governance: DG Research Councils, Grid TAG, e-Science Steering Committee, Director (awareness and co-ordination role; management role)
  • Generic Challenges: EPSRC (£15m), DTI (£15m)
  • Academic Application Support Programme: Research Councils (£74m), DTI (£5m)
    • PPARC (£26m), BBSRC (£8m), MRC (£8m), NERC (£7m), ESRC (£3m), EPSRC (£17m), CLRC (£5m)
  • £80m collaborative projects; Industrial Collaboration (£40m)

  8. UK e-Science Programme (2), 2003–2005
  • Total > £115M over 2 years
  • Governance: DG Research Councils, Grid TAG, e-Science Steering Committee, Director (awareness and co-ordination role; management role)
  • Generic Challenges: EPSRC (£15m), DTI (£15m)
  • Academic Application Support Programme: Research Councils (£74m), DTI (£5m)
    • PPARC (£26m), BBSRC (£8m), MRC (£8m), NERC (£7m), ESRC (£3m), EPSRC (£17m), CLRC (£5m)
  • £80m collaborative projects; Industrial Collaboration (£40m)

  9. NeSC

  10. NeSC in the UK
  • National e-Science Centre: Edinburgh + Glasgow; regional centres at Newcastle, Belfast, Manchester, Daresbury Lab, Cambridge, Oxford, Hinxton, RAL, Cardiff, London, Southampton
  • Directors' Forum: helped build a community
  • Engineering Task Force; Grid Support Centre
  • Architecture Task Force: UK adoption of OGSA, OGSA Grid Market, Workflow Management
  • Database Task Force: OGSA-DAI, GGF DAIS-WG
  • e-SI Programme: training, coordination, community building, workshops, pioneering
  • GridNet; HPC(x)

  11. NeSC Staff
  • Senior staff
    • Prof. Malcolm Atkinson, Director
    • Dr Arthur Trew, Deputy Director
    • Dr Anna Kenway, Centre Manager
    • Ms Gill Maddy, Event Manager
    • Dr Dave Berry, Research Manager
    • Dr Richard Sinnott, Technical Director (Glasgow)
    • Dr Mark Parsons, Commercial Director
    • Mr Stuart Anderson, Regional Director
  • Research partnerships
    • Dr Bob Mann, Institute for Astronomy
    • Dr Richard Baldock, MRC Human Genetics Unit
  • Industrial partnerships
    • Dr Andy Knox, IBM Greenock
    • Dr Dave Pearson, Oracle

  12. NeSC Related Projects
  • SHEFC: ScotGrid £0.9M, eDIKT £2.3M, SRIF £2.3M
  • Wellcome: Cardiovascular Functional Genomics £5.4M
  • MRC: Neuroinformatics Grid £1.5M (Biobank Scottish Spoke)
  • PPARC: AstroGrid £5M, GridPP £17M
  • EPSRC: e-STORM £359K, GridNet £595K, DTA Neuroinf. £6M, IRCs (Equator, AKT, DIRC), Nanotechnology
  • EU IST FP5 projects: GridStart €1.5M, Enacts €0.8M, Data Grid €10M
  • Centre projects: OGSA-DAI £1.3M, SunGrid £400K, GridWeaver £132K
  • Proposed centre projects: Bridges £372K, OGSA-DAI II £277K, GridWeaver 2 £400K, PGPGrid £312K, MS.NETGrid £112K, FirstDIG £90K

  13. EU Grid Projects
  • DataGrid (CERN, …)
  • EuroGrid (Unicore)
  • DataTag (TTT…)
  • Astrophysical Virtual Observatory
  • GRIP (Globus/Unicore)
  • GRIA (e-Business, …)
  • GridLab (Cactus, …)
  • CrossGrid
  • EGSO (Solar Physics)
  • GridStart
  • 45 million Euros

  14. NeSC Internationally
  • e-Science Institute: >1000 different participants from >25 countries
  • Conferences organised: Blue Gene; opening by Gordon Brown; Sun HPC Consortium; Applications workshop; Global Grid Forum 5; HPDC 11
  • N+N meetings: USA, San Francisco, Aug 01; China bioinf., e-SI, June '02; USA, London, Oct '02; China, Kunming, Jan '03
  • Visitors: Ian Foster, Steve Tuecke, Greg Riccardi, Roy Williams, Jim Gray (03), Alex Szalay (03)
  • North American visits: SDSC & ISI, Nov 01; SuperComputing 01; Canarie 7, Toronto; ANL, Nov 01 (OGSA, GGF5 planning); NPACI meeting; Toronto, GGF4 (OGSA, DAIS & GGF5 planning); NCSA, Feb 02; ANL, Feb 02; ANL, Early Adopters, June 02; Magic meeting, Sep 02; GGF6, Chicago; SuperComputing 02, Baltimore; GlobusWorld, San Diego, Jan 03
  • Programme committees: GGF4, GGF5, HPDC11, GGF7, HPDC12
  • DB chapter in Edition 2 of the Grid book

  15. Grids & Data

  16. An X-informatics Grid (layer diagram)
  • Users and providers: X-ologists, data providers, data curators
  • X-informatics application
  • X-informatics common high-level infrastructure: data integration, distributed data access, data mining, Semantic Grid
  • Grid plumbing & security infrastructure: scheduling, monitoring, accounting, diagnosis, logging, authorisation
  • Data & compute resources; structured data
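  To make the layering concrete, here is a small illustrative sketch in Python of one possible reading of the diagram above. All names (GridPlumbing, CommonInfrastructure, XInformaticsApplication, authorise, query) are invented for illustration and are not NeSC, OGSA or Globus APIs.

```python
# Illustrative sketch only: one possible reading of the layer diagram above.
# Every class and method name here is hypothetical.

class GridPlumbing:
    """Lower layer: security, scheduling, monitoring, accounting, logging."""

    def authorise(self, user: str, resource: str) -> bool:
        # A real grid would check certificates / virtual-organisation membership.
        return True

    def log(self, event: str) -> None:
        print(f"[grid log] {event}")


class CommonInfrastructure:
    """Middle layer: distributed data access, data integration, data mining."""

    def __init__(self, plumbing: GridPlumbing):
        self.plumbing = plumbing

    def query(self, user: str, source: str, expression: str) -> list:
        if not self.plumbing.authorise(user, source):
            raise PermissionError(f"{user} may not read {source}")
        self.plumbing.log(f"query on {source}: {expression}")
        return []  # placeholder result set


class XInformaticsApplication:
    """Top layer: the discipline-specific application used by the X-ologists."""

    def __init__(self, infra: CommonInfrastructure):
        self.infra = infra

    def run(self, user: str) -> list:
        return self.infra.query(user, "curated_db", "SELECT * FROM observations")


app = XInformaticsApplication(CommonInfrastructure(GridPlumbing()))
print(app.run("x_ologist"))
```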

  17. Database Growth: PDB protein structures

  18. More Computation
  • As computer performance improves, the range of applications increases
  • HPCx: £53M, 3 machines
  • Example applications: whole earth climate, organs, solar weather, eddy resolution, cells, complex multiscale astroplasmas, materials design, oceans, whole aircraft, drug design, protein structures, nanostructures

  19. Grid Data Service interactions (client, Registry, Factory, Grid Data Service, XML/relational database; SOAP/HTTP; service creation API)
  • 1a. Client asks the Registry for sources of data about "x"
  • 1b. Registry responds with a Factory handle
  • 2a. Client asks the Factory for access to the database
  • 2b. Factory creates a GridDataService (GDS) to manage the access
  • 2c. Factory returns the GDS handle to the client
  • 3a. Client queries the GDS with XPath, SQL, etc.
  • 3b. GDS interacts with the database
  • 3c. Results of the query are returned to the client as XML
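  The sequence above reads as a simple three-step protocol: discover, create, query. The sketch below walks through it in Python with purely local stand-in objects; the class names Registry, Factory and GridDataService mirror the diagram, but the methods (find, create_gds, perform) are hypothetical, not the real OGSA-DAI API, and a real client would invoke the corresponding Grid services over SOAP/HTTP and receive XML result documents.

```python
# Purely local stand-ins for the services in the diagram; all method names
# are hypothetical. A real client would call Grid services over SOAP/HTTP.

class GridDataService:
    def __init__(self, database: dict):
        self.database = database                  # 2b. a GDS manages access to one database

    def perform(self, query: str) -> str:
        rows = self.database.get(query, [])       # 3b. GDS interacts with the database
        body = "".join(f"<row>{r}</row>" for r in rows)
        return f"<results>{body}</results>"       # 3c. results returned as XML


class Factory:
    def __init__(self, database: dict):
        self.database = database

    def create_gds(self) -> GridDataService:
        return GridDataService(self.database)     # 2b/2c. create a GDS, hand back its handle


class Registry:
    def __init__(self, factories: dict):
        self.factories = factories

    def find(self, topic: str) -> Factory:
        return self.factories[topic]              # 1b. respond with a Factory handle


# Client side, following steps 1a-3c above.
toy_db = {"SELECT name FROM proteins": ["haemoglobin", "myoglobin"]}
registry = Registry({"x": Factory(toy_db)})

factory = registry.find("x")                      # 1a. ask the Registry about "x"
gds = factory.create_gds()                        # 2a-2c. obtain a Grid Data Service
print(gds.perform("SELECT name FROM proteins"))   # 3a-3c. query; XML results come back
```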

  20. OGSA-DAI Release 1 available
  • http://www.ogsadai.org.uk
  • http://www.ogsa-dai.org.uk
  • http://www.ogsa-dai.org
  • http://www.ogsadai.org

  21. Access Grid Nodes
  • Technology developed by Rick Stevens' group at Argonne National Laboratory
  • Access Grid will enable informal and formal group-to-group collaboration
    • Distributed lectures and seminars
    • Virtual meetings
    • Complex distributed grid demos
  • Uses MBONE and multicast Internet technologies
  • From a presentation by Tony Hey
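  As background on the multicast transport mentioned in the last bullet, here is a minimal, generic IP-multicast listener using only the Python standard library; the group address 224.1.1.1 and port 5000 are arbitrary examples, not Access Grid venue settings.

```python
# Minimal IP multicast listener (standard library only).
# The group/port below are illustrative placeholders, not Access Grid values.
import socket
import struct

GROUP = "224.1.1.1"   # example multicast group address
PORT = 5000           # example port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Join the multicast group on all interfaces.
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, addr = sock.recvfrom(65535)
    print(f"received {len(data)} bytes from {addr}")
```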

  22. Example Applications

  23. Wellcome Trust: Cardiovascular Functional Genomics
  • Public curated data and shared data across sites: Glasgow, Edinburgh, Leicester, Oxford, London, Netherlands

  24. LHC Computing Challenge
  • 1 TIPS = 25,000 SpecInt95; a PC (1999) = ~15 SpecInt95
  • Online system: one bunch crossing per 25 ns; 100 triggers per second; each event is ~1 Mbyte; ~PBytes/sec off the detector; ~100 MBytes/sec to the offline farm (~20 TIPS); ~100 MBytes/sec into Tier 0
  • Tier 0: CERN Computer Centre, >20 TIPS; ~Gbits/sec or air freight to Tier 1
  • Tier 1: regional centres (RAL, US, Italian, French), each with HPSS mass storage
  • Tier 2: Tier 2 centres, ~1 TIPS each, including ScotGRID++ (~1 TIPS)
  • Tier 3: institute servers (~0.25 TIPS); physicists work on analysis "channels"; each institute has ~10 physicists working on one or more channels; data for these channels should be cached by the institute server (physics data cache)
  • Tier 4: workstations, 100–1000 Mbits/sec to the institute server
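  A quick sanity check using only the figures quoted on the slide (a back-of-envelope reading added here, not part of the original presentation): the raw 40 MHz bunch-crossing rate shows why triggering is essential, and the post-trigger rate reproduces the quoted ~100 MB/s into offline processing.

```latex
% Bunch-crossing rate implied by "one crossing per 25 ns":
\[
  \frac{1}{25\,\mathrm{ns}} = \frac{1}{25\times10^{-9}\,\mathrm{s}}
  = 4\times10^{7}\ \mathrm{crossings/s} \;(= 40\ \mathrm{MHz})
\]
% Post-trigger data rate: 100 selected events per second at ~1 MB per event:
\[
  100\ \mathrm{events/s} \times 1\ \mathrm{MB/event} \approx 100\ \mathrm{MB/s}
\]
```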

  25. Distributed Aircraft Maintenance Environment (DAME): Universities of Leeds, Oxford, Sheffield & York
  • Global in-flight engine diagnostics: in-flight data sent over a global network (e.g. SITA) via ground stations
  • Data flows on to the airline, the DS&S Engine Health Center, the data centre and the maintenance centre, with notification by internet, e-mail and pager

  26. Comparative Functional Genomics
  • Large amounts of data
  • Highly heterogeneous: data types, data forms, community
  • Highly complex and inter-related
  • Volatile

  27. UCSF and UIUC. From Klaus Schulten, Center for Biomolecular Modeling and Bioinformatics, Urbana-Champaign

  28. Questions & Answers
