
GENI Science Shakedown

Presentation Transcript


  1. GENI Science Shakedown Paul Ruth RENCI / UNC Chapel Hill pruth@renci.org

  2. Motivation

  3. Motivation Are the current GENI technologies capable of effectively supporting the computing needs of domain science?

  4. Motivation Are the current GENI technologies capable of effectively supporting the computing needs of domain science? What are the benefits and limitations of using GENI for domain science?

  5. Observations

  6. Observations GENI has matured to the point where it is now time to evaluate its capabilities with respect to domain science applications “at scale” and use the results of this evaluation to inform the direction of future GENI development.

  7. Observations GENI has matured to the point where it is now time to evaluate its capabilities with respect to domain science applications “at scale” and use the results of this evaluation to inform the direction of future GENI development. There is an opportunity to introduce domain scientists to the GENI way of thinking about networking and distributed systems in order to seed the dissemination of GENI technologies to existing and future high-throughput and high-performance computing centers.

  8. Personnel
  • Computer Scientists:
    • Paul Ruth, pruth@renci.org
    • Anirban Mandal, anirban@renci.org
  • Domain Scientists:
    • Jeffery L. Tilson, jtilson@renci.org
    • Brian Blanton, bblanton@renci.org

  9. Project Goals
  • Apply the GENI experiment workflow to domain science applications and evaluate GENI’s ability to run domain science experiments
    • Performance
    • Ease of use
  • Provide feedback to GENI rack developers on current capabilities with respect to science applications, as well as target areas for improvement
  • Build a tool for domain scientists to create RSpecs from high-level descriptions of high-throughput and high-performance applications

  10. Project Goals
  • Apply the GENI experiment workflow to domain science applications and evaluate GENI’s ability to run domain science experiments (performance and ease of use)
  • Build a tool for domain scientists to create RSpecs from high-level descriptions of high-throughput and high-performance applications
  • Provide feedback to GENI rack developers on current capabilities with respect to science applications, as well as target areas for improvement
  Bonus Goal: Attract domain scientists to GENI
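The RSpec-generation goal can be illustrated with a small sketch. Assuming the GENI RSpec v3 request format, a tool like this might map a simple high-level description (a node count) to a request document; the sliver type, node names, and element layout here are simplified assumptions for illustration, not the project's actual tool:

```python
# Hypothetical sketch of the proposed RSpec-generation tool: turn a
# high-level cluster description into a GENI RSpec v3 request document.
# The sliver type and client_id naming scheme are illustrative assumptions.
import xml.etree.ElementTree as ET

RSPEC_NS = "http://www.geni.net/resources/rspec/3"

def cluster_to_rspec(node_count, sliver_type="emulab-xen"):
    """Emit a request RSpec for a simple n-node cluster on one LAN."""
    rspec = ET.Element("rspec", {"type": "request", "xmlns": RSPEC_NS})
    link = ET.SubElement(rspec, "link", {"client_id": "lan0"})
    for i in range(node_count):
        node = ET.SubElement(rspec, "node", {"client_id": f"node{i}"})
        ET.SubElement(node, "sliver_type", {"name": sliver_type})
        ET.SubElement(node, "interface", {"client_id": f"node{i}:if0"})
        # Attach every node's interface to the shared LAN link.
        ET.SubElement(link, "interface_ref", {"client_id": f"node{i}:if0"})
    return ET.tostring(rspec, encoding="unicode")

print(cluster_to_rspec(3))
```

A domain scientist would say "give me a 3-node cluster" and never touch the XML, which is the point of the tool.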

  11. Domain Science Applications
  • MotifNetwork
    • Jeffery L. Tilson (RENCI)
    • High-throughput computational genomics
    • Identification of functional domains and of conserved functional relationships across large numbers of genomes
    • Workflow ensemble (Pegasus/HTCondor)
    • Task data sets on the order of 50-100 GB
    • Implicit iteration
    • Scales to 1000s of tasks
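A toy sketch of the workflow-ensemble pattern behind MotifNetwork (this is not the application itself): tasks form a dependency DAG and each task runs once its prerequisites finish, which is how a Pegasus/HTCondor ensemble scales to thousands of tasks. The split/analyze/merge task names below are illustrative assumptions:

```python
# Toy workflow-ensemble sketch: execute a task DAG in dependency order.
# In the real system, Pegasus/HTCondor schedules these tasks across a
# cluster; here we just compute a valid serial execution order.
from collections import deque

def run_workflow(deps):
    """deps maps task -> list of prerequisite tasks; returns run order."""
    indegree = {t: len(p) for t, p in deps.items()}
    children = {t: [] for t in deps}
    for task, prereqs in deps.items():
        for p in prereqs:
            children[p].append(task)
    ready = deque(t for t, d in indegree.items() if d == 0)
    order = []
    while ready:
        task = ready.popleft()
        order.append(task)          # "execute" the task
        for child in children[task]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
    return order

# Illustrative genomics-style workflow: split input, analyze chunks, merge.
deps = {"split": [], "analyze0": ["split"], "analyze1": ["split"],
        "merge": ["analyze0", "analyze1"]}
print(run_workflow(deps))  # split runs first, merge runs last
```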

  12. Domain Science Applications
  • ADCIRC
    • Brian Blanton (RENCI)
    • Storm surge and tide model
    • Finite element model
    • Tightly coupled MPI
    • Urgent computing
    • Approved by FEMA for computing storm surge flood hazard simulations
    • Used for Digital Flood Insurance Rate Maps (DFIRMs)
    • Scales to 10,000+ MPI processes

  13. Experiments
  • Provisioning overhead
    • Creating, destroying, and modifying slices
  • GENI for big data
    • Large workflow inputs/outputs
  • GENI for MPI
    • Tightly coupled computation
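The provisioning-overhead experiment above can be sketched as a simple timing harness: repeat create/destroy cycles and report the mean and spread. The real experiment would invoke a GENI control framework (for example, the omni command-line tool); the `create_slice`/`delete_slice` functions below are stand-in stubs, not real GENI calls:

```python
# Sketch of the provisioning-overhead measurement: time repeated
# create/delete cycles and summarize. create_slice/delete_slice are
# stubs standing in for calls to a GENI control framework.
import statistics
import time

def create_slice(name):
    time.sleep(0.01)   # stub: a real call would take seconds to minutes

def delete_slice(name):
    time.sleep(0.01)   # stub

def measure_provisioning(trials=5):
    """Return (mean, stdev) of create+delete wall-clock time in seconds."""
    durations = []
    for i in range(trials):
        start = time.perf_counter()
        create_slice(f"shakedown-{i}")
        delete_slice(f"shakedown-{i}")
        durations.append(time.perf_counter() - start)
    return statistics.mean(durations), statistics.stdev(durations)

mean_s, stdev_s = measure_provisioning()
print(f"create+delete: {mean_s:.3f}s +/- {stdev_s:.3f}s")
```

The same harness shape (time the operation, repeat, report variance) applies to the modify-slice case as well.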

  14. Outreach to Domain Science
  • Tool for easily deploying domain science applications
    • RSpec generation for a Pegasus/HTCondor cluster and an MPI cluster
    • Extensible to other science slice types
  • Introducing domain scientists to GENI
  • Feedback from domain scientists to inform future GENI development

  15. Final thoughts on what we will find
  • Most problems will be at the edges
    • Disk I/O
      • Local disk
      • Network storage
    • Network virtualization mechanism
      • Tradeoffs of virtio, OVS, Linux bridges, and SR-IOV
  • More overhead than we realize
  • Performance isolation
  • Something unexpected

  16. First year goals
  • GEC 19
    • Applications working on InstaGENI and ExoGENI
    • Initial performance profile
  • GEC 20
    • Deeper performance profile
    • Initial suggestions for rack developers

  17. Thanks! Questions?
