
PolarGrid



Presentation Transcript


  1. PolarGrid
  CReSIS, Lawrence, Kansas, February 12, 2009
  Geoffrey Fox (PI), Computer Science, Informatics, Physics; Chair, Informatics Department; Director, Digital Science Center and Community Grids Laboratory, Indiana University, Bloomington IN 47404, gcf@indiana.edu
  Linda Hayden (co-PI), ECSU

  2. Support CReSIS with Cyberinfrastructure
  • Base and field camps for Arctic and Antarctic expeditions
  • Training and education resources at ECSU
  • Collaboration technology at ECSU
  • Lower-48 system at Indiana University and ECSU to support offline data analysis and large-scale simulations (next stage)
  • Initially a modest system at IU/ECSU for data analysis

  3. PolarGrid Greenland 2008
  Base System (Ilulissat, Airborne Radar)
  • 8U, 64-core cluster with 48TB external fibre-channel array
  • Laptops (one-off processing and image manipulation)
  • 2TB MyBook tertiary storage
  • 12TB total data acquisition (plus 2 backup copies; see the verification sketch below)
  • Satellite transceiver available if needed, but the wired network at the airport was used to send data back to IU
  Base System (NEEM, Surface Radar, Remote Deployment)
  • 2U, 8-core system using internal hot-swap hard drives for data backup
  • 4.5TB total data acquisition (plus 2 backup copies)
  • Satellite transceiver used to send data back to IU
  • Laptops (one-off processing and image manipulation)
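
The "plus 2 backup copies" figures above amount to keeping each acquisition in triplicate. As a rough illustration of that discipline, and not the actual PolarGrid tooling, the Python sketch below checksums a primary data tree against its backup copies; the paths, directory layout, and function names are hypothetical.

```python
# Illustrative sketch only: verify that field data and its backup copies match.
# The paths and the "two backup copies" layout are assumptions for this example,
# not the actual PolarGrid directory structure.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backups(primary: Path, backups: list[Path]) -> list[str]:
    """Compare every file under `primary` with the same relative file in each backup."""
    problems = []
    for f in primary.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(primary)
        want = sha256_of(f)
        for b in backups:
            copy = b / rel
            if not copy.is_file():
                problems.append(f"missing in {b}: {rel}")
            elif sha256_of(copy) != want:
                problems.append(f"checksum mismatch in {b}: {rel}")
    return problems

if __name__ == "__main__":
    # Hypothetical layout: one primary acquisition tree and two backup copies.
    issues = verify_backups(Path("/data/primary"),
                            [Path("/data/copy1"), Path("/data/copy2")])
    print("OK" if not issues else "\n".join(issues))
```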

  4. PolarGrid goes to Greenland

  5. NEEM 2008 Base Station

  6. PolarGrid Antarctic 2008/2009
  Base System (Thwaites Glacier Surface Radar)
  • 2U, 8-core system using internal hot-swap hard drives for data backup
  • 11TB total data acquisition (plus 2 backup copies)
  • Satellite transceiver used to send data back to IU
  • Laptops (one-off processing and image manipulation)
  IU-funded Sys-Admin
  • 1 admin, Greenland NEEM 2008
  • 1 admin, Greenland 2009 (March 2009)
  • 1 admin, Antarctica 2009/2010 (Nov 2009 – Feb 2010)
  • Note that the IU effort is a collaboration between the research group and University Information Technology support groups

  7. ECSU and PolarGrid
  Assistant Professor Eric Akers and graduate student Je’aime Powell from ECSU travel to Greenland: ECSU supports PolarGrid cyberinfrastructure in the field.
  Initially:
  • A base camp 64-core cluster, allowing near real-time analysis of radar data by the polar field teams
  • An educational videoconferencing Grid to support educational activities
  • A PolarGrid laboratory for students

  8. PolarGrid Lab
  • Operating systems: Mac OS X, Ubuntu Linux, Windows XP
  • Public IP accessible through the ECSU firewall
  • Additional software: desktop publishing, word processing, web design, programming, mathematical applications, Geographic Information Systems (GIS)

  9. Experience from Supporting Expeditions I
  • Base processing (NEEM 2008): 600GB–1TB on 8 cores takes ~8-12 hours
  • Expeditions are data-collection intensive, with the goal of pre-processing and validating each day's data gathering within 24 hours (see the sketch below)
  • Laptops used for one-off pre-processing and image manipulation/visualization
  • Heavy use of MATLAB for all processing (both pre-processing and full processing)
  • CReSIS is using the PolarGrid base cluster for full processing of all data collected so far
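
As a rough sense of what the daily validation pass could look like, the Python sketch below fans simple per-file sanity checks across the base system's 8 cores. It is illustrative only: the real CReSIS pipeline is MATLAB-based, and the file extension, directory layout, and checks used here are assumptions.

```python
# Illustrative sketch only: spread daily data-validation jobs across the
# 8 cores of the base system. The file layout and the per-file check are
# assumptions for this example, not CReSIS's actual MATLAB pipeline.
from multiprocessing import Pool
from pathlib import Path

def validate_file(path: Path) -> tuple[str, bool]:
    """Cheap sanity check: the file is non-empty and readable to its last byte."""
    try:
        size = path.stat().st_size
        with path.open("rb") as f:
            f.seek(max(0, size - 1))
            f.read(1)
        return (path.name, size > 0)
    except OSError:
        return (path.name, False)

def validate_day(day_dir: Path, workers: int = 8) -> None:
    """Check every raw file collected for one day, using a pool of worker processes."""
    files = sorted(p for p in day_dir.rglob("*.dat") if p.is_file())  # hypothetical extension
    with Pool(processes=workers) as pool:
        for name, ok in pool.imap_unordered(validate_file, files):
            if not ok:
                print(f"FAILED: {name}")

if __name__ == "__main__":
    validate_day(Path("/data/2008-07-15"))  # hypothetical daily directory
```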

  10. Experience from Supporting Expeditions II
  • Lessons from field use include the necessity of smaller computing engines due to size, weight, and power limitations
  • Greenland 2008 successes demonstrated the importance of PolarGrid equipment: CReSIS is now using PG gear to store and process data from 2 additional radar systems
  • Smaller system footprint and simpler data management have driven the cost per system down; complex storage environments are not practical in a mobile data-processing environment
  • Pre-processing data in the field has allowed validation of data acquisition during the collection phases

  11. Field Results – 2008/09
  “Without on-site processing enabled by POLARGRID, we would not have identified aircraft inverter-generated RFI. This capability allowed us to replace these ‘noisy’ components with better-quality inverters, incorporating CReSIS-developed shielding, to solve the problem mid-way through the field experiment.”
  Campaigns: Jakobshavn 2008, NEEM 2008, GAMBIT 2008/09

  12. TeraGrid High Performance Computing Systems 2007-2008
  [Map of TeraGrid computational resources, sizes approximate and not to scale: PSC, UC/ANL, PU, IU, NCSA, NCAR, ORNL, Tennessee (504TF), LONI/LSU, SDSC, TACC, plus a ~1PF system in 2008. Slide courtesy of Tommy Minyard, TACC.]

  13. Future Features of PolarGrid
  • PolarGrid will give all of CReSIS access to TeraGrid to support large-scale computing
  • PolarGrid will support the CyberInfrastructure Center for Polar Science (CICPS) concept, i.e. the national distributed collaboration to understand ice sheet science
  • Cyberinfrastructure levels the playing field in research and learning: students and faculty can contribute based on interest and ability, not on affiliation
  • PolarGrid will be configured as a cloud for ease of use; virtual machine technology is especially helpful for education
  • The PolarGrid portal will use Web 2.0-style tools to support collaboration
  • ECSU can use PolarGrid to enhance local facilities and Internet2 connectivity
