
Impact Requirements Analysis Team



1. Impact Requirements Analysis Team
Final Report in the Forum: RATS: Science Impact and Metrics
• Co-Chairs: Mark Sheddon (SDSC), Ann Zimmerman (University of Michigan)
• Members: John Cobb (ORNL), Dave Hart (SDSC), Lex Lane (NCSA), Scott Lathrop (UC/ANL), Sergiu Sanielevici (PSC), Kevin Walsh (SDSC)
• GDO Supervisor: Dane Skow (ANL, Deputy Director)
• Contact: mi-rat@teragrid.org

2. Impact Requirements Analysis Team
• Purpose: Investigate and recommend measures to assess the short-, mid-, and long-term effects of the TG's capabilities and resources on scientific discovery

3. Guiding Questions
• What impact has the TeraGrid had on the practice of scientific research?
• What impact has the TeraGrid had on the production of scientific knowledge?

4. Guiding Principles
• Strike a balance between:
  • the usefulness of the potential approaches,
  • the effort required from TG users and reviewers to provide data, and
  • the number and type of TG personnel necessary to collect, manage, and analyze the data
• Consider all aspects of TG, including people as well as compute and non-compute resources
• Consider concerns related to data privacy and confidentiality

5. Summary of (short- to medium-term) Impact RAT Recommendations
• #1: Modify the Partnerships Online Proposal System (POPS) to make it more useful and mineable
• #2: Create a "nugget" database
• #3: Instrument compute and non-compute resources for usage data collection
• #4: Categorize publications
• #5: Look deeper into the user community
• #6: Continue doing an annual user survey to gain direct feedback
• #7: Learn from others

6. #1: Modify POPS to make it more useful and mineable
• Standardize the collection of existing and new data gathered from users
  • Examples: store PI names as "last name, first name"; select the funding agency from a stored list; specify a structure for proposal attachments
• Improve the qualitative impact information requested from users during the renewal process
  • Examples: Why is the computational part hard? How did TeraGrid help you accomplish this part?
• Consider requesting standardized impact-related information from reviewers
  • Examples: (a) type of research (e.g., incremental; high-risk, high-payoff); (b) numeric rating of impact quality
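
To make the standardization idea concrete, here is a minimal Python sketch of the kind of input normalization POPS could apply at submission time. The function names, field conventions, and agency list are illustrative assumptions, not the actual POPS schema.

```python
# Hypothetical sketch of POPS-style input normalization.
# The agency list and name convention are assumptions for illustration.

KNOWN_AGENCIES = {"NSF", "NIH", "DOE", "DOD", "NASA", "Other"}

def normalize_pi_name(raw: str) -> str:
    """Store PI names uniformly as 'Last, First' regardless of input order."""
    raw = raw.strip()
    if "," in raw:                      # already 'Last, First'
        last, first = [p.strip() for p in raw.split(",", 1)]
    else:                               # assume 'First [Middle] Last'
        parts = raw.split()
        last, first = parts[-1], " ".join(parts[:-1])
    return f"{last}, {first}"

def validate_agency(agency: str) -> str:
    """Force the funding agency to come from a stored list, not free text."""
    if agency not in KNOWN_AGENCIES:
        raise ValueError(f"Unknown agency {agency!r}; choose from {sorted(KNOWN_AGENCIES)}")
    return agency

print(normalize_pi_name("Ann Zimmerman"))   # -> 'Zimmerman, Ann'
print(validate_agency("NSF"))               # -> 'NSF'
```

Normalizing at entry time, rather than cleaning free text afterward, is what makes the resulting records mineable.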

7. #2: Create a "nugget" database
• Our current collection method is ad hoc
• NSF would like us to improve our nugget submittals (NSF has their own)
• Many could contribute
• Nuggets would not have to be only science successes (e.g., "practice" successes also count)
• Components of a good nugget (Guy Almes):
  • Why is this science important?
  • Why is the computational/cyberinfrastructure part hard?
  • What did the TeraGrid do to help accomplish the computational/cyberinfrastructure part?
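
A minimal sketch of what a nugget record could look like, built directly around Guy Almes's three components. The table and column names are assumptions for illustration; SQLite is used only because it makes the example self-contained.

```python
# Hypothetical nugget schema; not a deployed TeraGrid database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE nugget (
        id            INTEGER PRIMARY KEY,
        title         TEXT NOT NULL,
        submitter     TEXT NOT NULL,   -- many could contribute
        kind          TEXT NOT NULL,   -- 'science' or 'practice' success
        why_important TEXT NOT NULL,   -- why is this science important?
        why_hard      TEXT NOT NULL,   -- why is the CI part hard?
        teragrid_role TEXT NOT NULL    -- what did TeraGrid do to help?
    )
""")
conn.execute(
    "INSERT INTO nugget (title, submitter, kind, why_important, why_hard, teragrid_role) "
    "VALUES (?, ?, ?, ?, ?, ?)",
    ("Example nugget", "PI name", "science",
     "Advances field X.", "Needs cross-site runs.", "Provided coordinated allocations."),
)
for row in conn.execute("SELECT title, kind FROM nugget"):
    print(row)
```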

8. #3: Instrument compute and non-compute resources for usage data collection
• Particularly those related to the "grid" part of TG:
  • Cross-site runs
  • Grid middleware
  • Global file systems
  • Data collections
• We already collect plenty of SU-related usage data for compute resources
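
One way to read this recommendation: have each non-compute service emit a small, uniform usage record, parallel to the SU accounting that already exists for compute. The sketch below is a hypothetical event format, not an adopted TeraGrid schema.

```python
# Hedged sketch of a uniform usage event for non-compute services.
# Field names are illustrative assumptions.
import json, time

def usage_event(resource_type: str, resource: str, user: str, **detail) -> str:
    """Serialize one usage record for later aggregation."""
    record = {
        "ts": time.time(),              # when the use occurred
        "resource_type": resource_type, # e.g. 'gridftp', 'gpfs', 'data-collection'
        "resource": resource,           # which instance or site
        "user": user,                   # allocation or portal identity
        "detail": detail,               # bytes moved, files opened, query run, ...
    }
    return json.dumps(record)

print(usage_event("gridftp", "tg-gridftp.sdsc.edu", "user123", bytes_moved=10**9))
print(usage_event("gpfs", "/gpfs-wan/projects", "user123", files_opened=42))
```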

9. #4: Categorize publications
• Recommend additional analysis of the POPS publication list
• Categorize citations according to journal (as applicable), discipline, and "ranking," and add the POPS proposal number associated with each publication
• This provides greater detail on publication impact by showing the quality of the journal, etc.
• Including the POPS proposal number provides a means to tie publications to the TG resources and capabilities used and to reviewer input
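
A brief sketch of the proposed categorization as a data structure. The fields mirror the bullets above (journal, discipline, ranking, POPS proposal number); the sample records are invented.

```python
# Illustrative publication categorization; records are made up.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Publication:
    citation: str
    journal: str        # as applicable
    discipline: str
    ranking: int        # journal "ranking" bucket, however defined
    pops_id: str        # ties the paper back to proposal, resources, reviews

pubs = [
    Publication("Doe et al. 2006", "J. Comput. Chem.", "Chemistry", 1, "TG-CHE060001"),
    Publication("Roe et al. 2006", "Phys. Rev. D", "Physics", 1, "TG-PHY060002"),
]

# With pops_id in place, publications can be rolled up by proposal or discipline.
print(Counter(p.discipline for p in pubs))
```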

10. #5: Look deeper into the user community
• Improve the usage database so that it is possible to examine trends among "non-standard" users, such as:
  • Social sciences
  • Minority Serving Institutions
• For all users, track:
  • Institution and type of institution (e.g., 2-year, 4-year, MSI)
  • Type of user (e.g., race, gender, and status)
  • History of allocations received
• Over time, these data would help discern:
  • Whether education, outreach, and training programs are having an impact
  • How usage changes over time
  • Whether users continue to use TG (helpful in understanding why users "leave")
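
To illustrate the kind of trend analysis an improved usage database would support, here is a hedged sketch of usage grouped by institution type and year. The schema, table name, and rows are hypothetical.

```python
# Hypothetical usage schema supporting trend queries by institution type.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE usage (
        user_id    TEXT,
        inst_type  TEXT,   -- '2-year', '4-year', 'MSI', ...
        discipline TEXT,   -- lets 'non-standard' fields like social science surface
        year       INTEGER,
        sus_used   REAL
    );
    INSERT INTO usage VALUES
        ('u1', 'MSI',    'Social Science', 2005, 1200.0),
        ('u1', 'MSI',    'Social Science', 2006, 3400.0),
        ('u2', '4-year', 'Physics',        2006,  900.0);
""")
query = """
    SELECT inst_type, year, COUNT(DISTINCT user_id) AS users, SUM(sus_used) AS sus
    FROM usage GROUP BY inst_type, year ORDER BY inst_type, year
"""
for row in conn.execute(query):
    print(row)
```

The same grouping over allocation history would show whether users return or "leave" after their first award.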

11. #6: Continue doing an annual user survey to gain direct feedback
• A brief, focused survey minimizes the burden on users
• Coordinating random samples among different surveys reduces the chance that the same users will be solicited more than once
• TeraGrid should follow these and other guidelines to improve the reliability and validity of its surveys
  • In 2006, TG is doing this by participating in a survey conducted by the University of Michigan evaluation team
• Smaller surveys directed toward particular audiences or topics should also be considered
  • For example, pre- and post-surveys of researchers who benefit from ASTA support could be very informative
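
The sample-coordination guideline can be made concrete with a small sketch: draw disjoint random samples for each survey from a shared user pool, so no one is solicited twice in a cycle. The names and sample sizes are illustrative.

```python
# Illustrative coordinated sampling across surveys; sizes are made up.
import random

def coordinated_samples(users, sizes, seed=2006):
    """Draw one disjoint random sample per survey from the shared user pool."""
    rng = random.Random(seed)
    pool = list(users)
    rng.shuffle(pool)
    samples, start = [], 0
    for n in sizes:
        samples.append(pool[start:start + n])  # non-overlapping slices
        start += n
    return samples

users = [f"user{i:03d}" for i in range(200)]
annual, asta_pre = coordinated_samples(users, sizes=[50, 20])
assert not set(annual) & set(asta_pre)         # no user is solicited twice
print(len(annual), len(asta_pre))
```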

12. #7: Learn from others
• We should share what we've learned and monitor what others are doing
• Share this report with a broad range of individuals and institutions to gain their feedback:
  • DoD and DOE, Science Gateways, representative users, NSF officials, and experts in the measurement of science and technology impacts
• Hold a workshop

13. Longer-term possibilities
• Topics: the social organization of research, economic impact, users and usage of HPC resources, etc.
• Potential methods: network analysis, other forms of peer review, ongoing interviews and focus groups, and historical case studies
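
As a taste of the network-analysis method listed above, here is a toy sketch that builds a co-authorship graph from publication author lists and ranks authors by degree centrality. The author lists are invented, and the networkx library is assumed to be available.

```python
# Toy co-authorship network; data are invented, networkx is assumed installed.
from itertools import combinations
import networkx as nx

papers = [
    ["Zimmerman", "Cobb", "Hart"],   # hypothetical author lists
    ["Hart", "Lane"],
    ["Cobb", "Sanielevici", "Hart"],
]

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):   # every co-author pair is an edge
        G.add_edge(a, b)

# Degree centrality hints at who connects the collaboration network.
for name, c in sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1]):
    print(f"{name}: {c:.2f}")
```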

