TeraGrid Institute: Allocation Policies and Best Practices

David L. Hart, SDSC

[email protected]

June 4, 2007


The Basics

  • Who

  • What

  • When

  • Where

  • Why

  • How



The Lingo

  • DAC

    • Development Allocation Committee

  • MRAC

    • Medium Resource Allocations Committee

  • LRAC

    • Large Resource Allocations Committee

  • POPS

    • Partnerships Online Proposal System

  • Roaming

    • TeraGrid (Wide) Roaming (Access)

  • SU

    • Service Unit


The Process: Getting Started

  • Start-up Allocations (DAC Awards)

    • Accepted, reviewed, awarded on a continual basis

    • Up to 30,000 SUs of TeraGrid Roaming

    • Best for

      • Code development, porting, testing

      • Gathering performance data for MRAC/LRAC proposals

      • Classroom instruction

      • Small-scale computational needs

10 Minutes to an Allocation

  • Go to POPS (https://pops-submit.ci-partnership.org/).

  • Create a POPS user ID.

  • Log in.

  • Select “New” proposal type.

  • Select “0-30,000.”

  • Click on “DAC-TeraGrid.”

  • Fill out PI Info, Proposal Info, and Resource Request screens.

  • Upload PI’s CV.

  • Press “Final Submission.”


The Process: Going MRAC (or LRAC)

  • Requires a written proposal, reviewed by domain experts

  • PIs need to be aware of the lead time for getting an MRAC or LRAC award

  • MRAC

    • Limit: 500,000 SUs

    • Reviewed quarterly

    • Awards begin Jan. 1, April 1, July 1, Oct. 1

  • LRAC

    • More than 500,000 SUs

    • Reviewed semi-annually

    • Awards begin April 1, Oct. 1


The Awards

  • One per PI

  • Allocations made for 12-month periods

  • Unused SUs are forfeited at the end of an award period

  • Add users to a grant via the TeraGrid User Portal

  • Progress report required annually as part of renewal proposals and multi-year awards


The Options

  • Asking for Help

    • [email protected]

  • Multi-year Awards

    • Possible, but not recommended for new PIs

    • Only Progress Reports required in subsequent years

  • Justifications

    • To address reviewer concerns and get more of the requested SUs

    • Best for specific omissions (not to salvage horrible proposals)

  • Supplements

    • Request additional SUs during a 12-month allocation period

    • Not for DACs! Reviewed by MRAC/LRAC members.

  • Extensions

    • Can extend the award period an additional 6 months for cause

    • No additional SUs!

  • Advances

    • Up to 10% of an MRAC/LRAC request can be provided in advance


The Resources: Compute

  • Compute

    • Also Visualization

    • TeraGrid Resources Catalog

    • Can request specific resource(s) or TeraGrid Roaming

      • Except TeraGrid DACs, which are roaming only

    • Requests made in SUs

[Image: SDSC’s Blue Gene]


The Resources: Storage

  • Long-term disk and tape

  • Policies evolving, but some already available for award

    • Indiana HPSS Archive

    • SDSC Database

    • SDSC Collections Disk Space

    • TeraGrid GPFS-WAN

  • Look for announcements in this area soon…


The Resources: Advanced Support

  • NEW!

  • Dedicated TeraGrid staff assistance

  • Limited resources

  • MRAC/LRAC reviewers rate possible projects

  • Extra info required for proposals

http://www.teragrid.org/userinfo/asp.php


The Proposal: POPS

  • Straightforward (mostly)

    • Once you get to the Web-based data entry forms

  • Latest changes

    • Supporting grant information

  • Coming soon

    • Better TeraGrid integration

https://pops-submit.ci-partnership.org/


The Proposal: Proposal Document(s)

  • The real key to a successful review

  • There are page limits!

  • Sample proposals online

  • But now, some tips and advice…

http://www.ci-partnership.org/Allocations/


“Traditional” v. Community

  • MRAC/LRAC proposals are accepted in four general categories of research activities

    • Individual investigator

    • Research collaborations (e.g., MILC consortium)

    • Community Projects (e.g., NEES)

    • Community Services (e.g., ROBETTA, Gateways)

  • The general requirements for proposals of all four types remain largely the same.


Proposal Review Criteria

  • Computational Methodology

    • The choice of applications, methods, algorithms and techniques to be employed to accomplish the stated objectives should be reasonably justified. While the accomplishment of the stated objectives in support of the science is important, it is incumbent on proposers to consider the methods available to them and to use that which is best suited.

  • Appropriate Use of Resources

    • The resources chosen must be an appropriate match for the applications and methodologies to be used and must be in accordance with the recommended use guidelines for those resources.

  • Efficient Use of Resources

    • The resources selected must be used as efficiently as is reasonably possible. To meet this criterion, performance and parallel scaling data should be provided for all applications to be used along with a discussion of optimization and/or parallelization work to be done to improve the applications.

  • http://www.ci-partnership.org/Allocations/allocationspolicy.html


Additional Review Considerations

  • Prior progress

    • From a prior-year allocation, a DAC award, or work done locally

  • Ability to complete the work plan described (more significant for larger requests)

    • Sufficient merit-reviewed funding

    • Staff, both number and experience

  • Local computing environment

  • Other access to HPC resources

    • (e.g., Campus centers, DOE centers)


General Proposal Outline

  • I. Research Objectives

  • II. Codes and methods to be used

  • III. Computational plan

  • IV. Justification for SUs (TB-yrs) requested

  • V. Additional considerations

    Note: Sections III and IV are often integrated.


I. Research Objectives

  • Traditional proposals

    • Describe the research activities to be pursued

  • Community proposals

    • Describe the classes of research activities that the proposed effort will support.

  • Keep it short: You only need enough detail to support the methods and computational plan being proposed.

  • TIP—Reviewers don’t want to read the proposal you submitted to NSF/NIH/etc, but they will notice whether you have merit-reviewed funding.


II. Codes (Data) and Methods

  • Very similar between traditional and community proposals.

  • More significant if using ‘home-grown’ codes.

  • Provide performance and scaling details on problems and test cases similar to those being pursued.

  • Ideally, provide performance and scaling data collected by you for the specific resource(s) you are requesting (see the sketch below).
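
A minimal Python sketch of the kind of scaling table reviewers look for; the timings and core counts are invented for illustration, and efficiency is measured relative to the smallest run rather than a serial baseline:

    # Hypothetical wall-clock timings (seconds) for one fixed test case
    # at several core counts; replace with measured data.
    timings = {16: 420.0, 32: 225.0, 64: 130.0, 128: 85.0}

    base_cores = min(timings)
    base_time = timings[base_cores]

    print(f"{'Cores':>6} {'Time (s)':>9} {'Speedup':>8} {'Efficiency':>11}")
    for cores in sorted(timings):
        speedup = base_time / timings[cores]
        efficiency = speedup * base_cores / cores   # relative to the 16-core run
        print(f"{cores:>6} {timings[cores]:>9.1f} {speedup:>8.2f} {efficiency:>10.0%}")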


III. Computational Plan

  • Traditional proposals

    • Explicitly describe the problem cases you will examine

      • BAD: “…a dozen or so important proteins under various conditions…”

      • GOOD: “…7 proteins [listed here; include scientific importance of these selections somewhere, too]. Each protein will require [X] number of runs, varying 3 parameters [listed here] [in very specific and scientifically meaningful ways]…”

  • Community proposals

    • Explicitly describe the typical use-case(s) that the gateway supports and the type of runs that you expect users to make

    • Describe how you will help ensure that the community will make scientifically meaningful runs (if applicable)

      • BAD: “…the gateway lets users run NAMD on TeraGrid resources…”

      • BETTER: “…users will run NAMD jobs on [biological systems like this]…”

      • BETTER STILL: “…the gateway allows users to run NAMD jobs on up to 128 processors on problem sizes limited [in some fashion]…”


IV. Justification of SUs (TB-yrs)

  • Traditional proposals

    • If you’ve done sections II and III well, this section should be a straightforward math problem

    • For each research problem, calculate the SUs required based on the runs defined in Section III and the timings in Section II, broken out appropriately by resource (see the sketch below)

      • Reasonable scaling estimates from test-case timing runs to full-scale production runs are acceptable.

    • Clear presentation here will allow reviewers to cut time or storage in a rational fashion
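
As a hedged illustration of that arithmetic, a minimal Python sketch: the problems, run counts, core counts, and timings are invented, and 1 SU is assumed to equal 1 core-hour, which should be checked against each resource's actual SU definition:

    # Hypothetical plan: (problem, runs from Section III, cores per run,
    # wall-clock hours per run from the Section II timings).
    plan = [
        ("protein A, 3-parameter sweep",        12,  64,  6.0),
        ("protein B, equilibration/production",  8, 128, 10.0),
    ]

    total = 0.0
    for problem, runs, cores, hours in plan:
        sus = runs * cores * hours          # assumes 1 SU = 1 core-hour
        total += sus
        print(f"{problem}: {runs} runs x {cores} cores x {hours} h = {sus:,.0f} SUs")
    print(f"Total request: {total:,.0f} SUs")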


IV. Justification of SUs (TB-yrs)

  • Community proposals

    • The first big trick: Calculating SUs when you don’t know the precise runs to be made a priori.

    • In Year 2 and beyond

      • Start with an estimate of total usage based on the prior year’s usage patterns and an estimate of the coming year’s usage (justify in Section V).

      • From this information, along with data from sections II and III, you can come up with a tabulation of SU estimates.

    • Year 1 requires bootstrapping

      • Pick conservative values (and justify them) for the size of the community and runs to be made, and calculate SUs (see the sketch below).

      • TIP—Start modestly. If you have ~0 users, don’t expect the reviewers to believe that you will get thousands (or even hundreds) next year.
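
A minimal Python sketch of one way to bootstrap such a Year-1 estimate; every number below is an invented placeholder that would need to be justified in Section V, and 1 SU is again assumed to be 1 core-hour:

    # Conservative, hypothetical Year-1 assumptions for a gateway request.
    expected_users = 25            # current users plus modest growth
    runs_per_user_per_year = 40    # typical use-case from Section III
    cores_per_run = 32
    hours_per_run = 2.0            # from the Section II timings

    total_sus = expected_users * runs_per_user_per_year * cores_per_run * hours_per_run
    print(f"Estimated request: {total_sus:,.0f} SUs")   # 64,000 SUs with these values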


V. Additional Considerations

  • For traditional proposals, these are not controversial

    • Local computing environment

    • Other supercomputing resources

    • Prior Progress

    • Experience/staffing


V. Additional Considerations

  • For community proposals, these components can provide key details:

    • Community Support and Management Plan

      • Describe the gateway interface in terms of how it helps the community burn SUs.

      • Describe plans for growing the user community, “graduating” users to MRAC awards, and regulating “gateway hogs”

    • Progress report

      • The actual user community and usage patterns

      • Manuscripts produced thanks to this service.

    • Local computing environment

    • Other HPC resources


Questions?

http://teragrid.org/userinfo/access/accounts.php

[email protected]

[email protected]

