
The Global Coastal Ocean Modelling System

Presentation Transcript


  1. The Global Coastal Ocean Modelling System Jason Holt, James Harle, Sarah Wakelin, Sylvain Michel, Roger Proctor POL Mike Ashworth, Stephen Pickles, Crispian Batstone STFC Keith Haines, Dan Bretherton, Jon Blower ESSC Icarus Allen, Rob Holmes, Jerry Blackford, Katy Lewis PML

  2. The GCOMS mission: Jan '06 - Oct '09
  • Develop a system to automatically configure and run regional shelf-sea models for any/all coastal regions around the world
  • Motivation: the shelf seas' role in global biogeochemical cycles, and down-scaling climate-change impacts
  • [Figures: satellite estimate of primary production; fish catch, e.g. Watson, R. and Pauly, D. (2001) Nature 414, 534-536]

  3. The Global Coastal Ocean Modelling system
  • A practical solution available now:
  • Split the coastal seas into several domains (42 domains)
  • Deploy an automatically configured regional model in each: ~7 km resolution, 40 levels
  • Run each independently, one-way nested in a global GCM
  • Splits the problem into computationally tractable parts
  • Needs 7-70 times the computational resource of the global ocean model
  • See Holt et al. (2009) Modelling the Global Coastal Ocean, Phil. Trans. Roy. Soc. A
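The decomposition idea above can be sketched as a loop over independent regional domains, each one-way nested in the global GCM (boundary conditions flow inward only, with no feedback). This is a minimal illustrative sketch; every function and variable name here is a hypothetical stand-in, not the actual GCOMS/POLCOMS code.

```python
# Illustrative sketch of the GCOMS decomposition: split the coastal ocean
# into regional domains and run each independently, one-way nested in a
# global model. All names below are hypothetical stand-ins.

def extract_boundary_conditions(global_output, domain):
    # Stand-in: a real system would interpolate global GCM fields onto
    # the open boundary of the regional domain.
    return {"domain": domain, "source": global_output}

def run_regional_model(domain, bcs, resolution_km=7, levels=40):
    # Stand-in for a POLCOMS-ERSEM integration of one domain
    # (~7 km resolution, 40 levels, as on the slide).
    return {"domain": domain, "resolution_km": resolution_km,
            "levels": levels, "bcs": bcs}

def run_gcoms(domains, global_output):
    """One-way nesting: each domain runs independently, so this loop is
    trivially parallel across domains."""
    results = {}
    for domain in domains:
        bcs = extract_boundary_conditions(global_output, domain)
        results[domain] = run_regional_model(domain, bcs)
    return results

out = run_gcoms(["NORD03", "INDO23"], "global_gcm_run")
```

Because nothing is fed back to the global model, each domain is an independent job, which is what makes the problem computationally tractable.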

  4. POLCOMS-ERSEM: a model for the Global Coastal Ocean (coupled to ERSEM, the European Regional Seas Ecosystem Model)
  • Technically well developed for the task:
  • Dynamic memory
  • Run-time domain decomposition
  • Arbitrary open boundary shape
  • Runs on all computer platforms
  • Tested in operational oceanography: www.metoffice.gov.uk/reasearch/ncof/mrcs/browser.html
  • Holt and Proctor (2008) JGR

  5. The GCOMS system [architecture diagram]
  • Domain generator: Matlab
  • Global forcing archive and validation data sources
  • Compute resources: 300, 2,000 and 22,000 cores
  • Job control: G-Rex, SSHFS, shell scripts
  • Visualisation, validation, post-processing: Matlab, Perl, SSHFS
  • Users and collaborators

  6. Modes of execution [schematic]
  • Isolated domains: independent MPI jobs on a cluster or grid
  • Sequential domains: MPI jobs run in sequence, one supplying b.c.'s to the next, on a cluster or grid
  • Ensembles: isolated or communicating MPI jobs on an HPC system
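The distinction between the modes above is essentially a dependency graph: isolated domains can all be launched at once, while sequential domains must wait for the upstream domain that supplies their boundary conditions. A hedged sketch of that ordering logic, using the standard-library `graphlib` (the real GCOMS job-control scripts are G-Rex/shell, not this code):

```python
# Hypothetical sketch of execution ordering: domains with no b.c.
# dependencies ("isolated") may run in any order or concurrently;
# "sequential" domains must run after their upstream supplier.
from graphlib import TopologicalSorter  # stdlib, Python >= 3.9

def schedule(bc_deps):
    """bc_deps maps each domain to the set of domains it takes b.c.'s
    from. Returns a launch order respecting those dependencies."""
    return list(TopologicalSorter(bc_deps).static_order())

# Two isolated domains plus one nested pair: "inner" takes b.c.'s
# from "outer", so "outer" is guaranteed to appear first.
order = schedule({"NORD03": set(), "INDO23": set(),
                  "outer": set(), "inner": {"outer"}})
```

Ensembles fit the same picture: each ensemble member is just another independent node in the graph.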

  7. Domain coupling and cross-site runs using MPIg
  • Experiments using MPIg on the National Grid Service (Manchester and Leeds), using 2 domains of similar size
  • On a single host, the overhead of coupling domains is small
  • The overhead of cross-site runs (domains placed on different hosts, communicating using MPIg) is significant, but not preclusive
  • In practice, the number of cores needed to run both domains is readily available on the same machine

  8. Monolithic ensemble execution [figure: 1-month runs on 208 processors of HPCx, by domain size]

  9. Load balancing by number of sea points [figures: 128 and 256 processors]
  • At 516 processors and beyond, I/O reduces load balance (computation is ~82% balanced)
  • Some components scale with the total number of points
  • In practice, package domains into groups of 4-6
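Packaging domains into groups, as the last bullet describes, is a balancing problem: computation scales with the number of sea (wet) points, so groups should carry similar sea-point totals. A minimal sketch using a greedy "largest first" heuristic; the sea-point counts and the heuristic itself are illustrative assumptions, not the actual GCOMS scripts.

```python
# Greedy packaging of domains into processor groups, balanced by
# sea-point count (assumed cost proxy). Illustrative only; counts
# below are invented, not real GCOMS domain sizes.
import heapq

def package_domains(sea_points, n_groups):
    """sea_points: dict of domain -> wet-point count.
    Assigns each domain (largest first) to the lightest group."""
    heap = [(0, g, []) for g in range(n_groups)]  # (load, id, members)
    heapq.heapify(heap)
    for dom, pts in sorted(sea_points.items(), key=lambda kv: -kv[1]):
        load, g, members = heapq.heappop(heap)
        members.append(dom)
        heapq.heappush(heap, (load + pts, g, members))
    return [members for _, _, members in sorted(heap, key=lambda t: t[1])]

groups = package_domains(
    {"NORD03": 90_000, "INDO23": 250_000, "BENG27": 120_000,
     "HUMB10": 110_000, "CANA30": 80_000, "KURO24": 200_000},
    n_groups=2)
```

The same idea extends to within-domain decomposition: partitions are weighted by wet points rather than raw grid area, which is why the slide reports ~82% computational balance rather than a perfect split.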

  10. A POLCOMS G-Rex service [architecture diagram]
  • At POL: the G-Rex client, plus the POLCOMS model setup, including source code, work-flow scripts, input data and output from all runs
  • On the remote cluster: the G-Rex server (port 9092, with the firewall port open to POL), plus POLCOMS launch scripts and forcing data (the same every run)
  • Input and output pass via G-Rex over HTTP
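Since G-Rex moves input and output over plain HTTP, a client only needs to construct URLs against the server's open port. The sketch below shows that shape with the standard library; the host name and endpoint layout are hypothetical assumptions, NOT the real G-Rex API — only the port number (9092) comes from the slide.

```python
# Generic illustration of an HTTP job service in the style of G-Rex.
# Hypothetical host and URL layout; only port 9092 is from the slide.
from urllib.parse import urlunsplit, urlencode

GREX_HOST = "remote-cluster.example.org"  # hypothetical host name
GREX_PORT = 9092                          # port quoted on the slide

def job_url(service, job_id, action, **params):
    # Hypothetical endpoint layout: /<service>/jobs/<job_id>/<action>
    path = f"/{service}/jobs/{job_id}/{action}"
    return urlunsplit(("http", f"{GREX_HOST}:{GREX_PORT}",
                       path, urlencode(params), ""))

u = job_url("polcoms", "run42", "output", file="restart.nc")
# u -> "http://remote-cluster.example.org:9092/polcoms/jobs/run42/output?file=restart.nc"
```

The attraction of this design is that only one firewall port need be opened between POL and each remote cluster, with all model I/O tunnelled through it.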

  11. QUEST-FISH: a real example of a GCOMS application
  "How would climate change affect the potential production of global fisheries resources in the future, compared with past and present scenarios, in the absence of exploitation?"
  • Domains: NORD03, BERS15, NEBS43, NEFL07, KURO24, NWAM13, CANA30, BBEN22, INDO23, GGUI28, BENG27, HUMB10
  • The 12 model domains cover 20 Large Marine Ecosystems and >60% of the global fish catch
  • About 400M people get >50% of their animal protein from fish (FAO 2008)

  12. Impacts of climate change on global fisheries
  • One climate model and one emissions scenario: IPSL-CM4 with SRES A1B
  • Time slices (3-yr spin-up + 10-yr production):
  • Pre-industrial: 1864-1873
  • Present day: 1992-2001 (plus a reanalysis-forced run)
  • Near future: 2036-2044
  • Far future: 2086-2094
  • That's 780 domain-years of integration! But there is a high degree of parallelism: e.g. the runs could execute on ~13k processors/cores in about 10 days
  • Runs are ongoing…
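The "780 domain-years" figure follows directly from the numbers above: 12 domains, each integrated for five forced runs (pre-industrial, present day, reanalysis, near future, far future), each run being a 3-year spin-up plus a 10-year production period. As arithmetic:

```python
# Where "780 domain-years of integration" comes from (numbers taken
# from slides 11-13 of this talk).
domains = 12            # QUEST-FISH regional domains
runs_per_domain = 5     # PI, PD, reanalysis, near future, far future
years_per_run = 3 + 10  # spin-up + production

total_domain_years = domains * runs_per_domain * years_per_run
# total_domain_years == 780
```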

  13. Status of simulations
  • PI: pre-industrial, 1861-1873
  • PD: present day
  • 47: SRES A1B, 2047-2059
  • 82: SRES A1B, 2082-2094
  • N: reanalysis, 1989-2001
  • These runs were completed in ~6 months by 1.5 people, including basic validation (SST, net PP), using ~1.3M CPU hours on a Cray XT4

  14. Re-analysis forced runs, 1992-2001: mean PP [map]
  • This may not be the best model of each region, but it is the same model in every region

  15. North West Africa: validation of sea surface temperature [maps: mean RMS error and mean bias error]

  16. Differences in net PP: far future versus pre-industrial [map; units: g C m-2 yr-1]

  17. Key advantages of the GCOMS system
  • Consistent regional inter-comparisons
  • Ideal for forcing/parameter ensembles
  • Flexibly adapts to a changing computational landscape: massively parallel, multi-core systems
  • Operable by a small number of researchers; 'hand-made' regional models usually occupy 0.5-1 person each (or whole teams)
  • But the ability to "tune" the model for each domain is limited

  18. Limitations of the system
  • Limited sea-ice capability
  • Only polar coordinates
  • Limited ecosystem b.c.'s
  • Only 1-way nesting
  • A trade-off between resolution/process representation improving simulations and boundary conditions degrading them
  • Not as automated or flexible as we would like
  • Domain-definition algorithm needs refining
  • I/O performance bottleneck
  • Data handling needs work
  • NEMO-shelf

  19. Next steps for GCOMS
  • The CAPRI proposal: linking climate and socio-economic change to coastal flooding and fisheries (a NERC consortium bid)
