
Out of Control? Selecting Comparison Groups for Analyzing NIH Grants and Grant Portfolios

American Evaluation Association Meeting

Saturday November 14, 2009

Session Purpose

  • Explore the choices we make relating to comparison groups in a science management context

  • Examples drawn from different NIH Institutes/Centers

  • Variety of contexts

  • Focused on the methodology of comparison group selection rather than results of particular evaluations

  • Opportunity to learn about efforts to select comparison groups and to discuss the strengths and weaknesses of the various methodological choices with other evaluation experts

Session Overview

3:30 – 3:35: Introduction

3:35 – 3:50: Christie Drew: Unsolicited P01s at NIEHS (P01 = multi-project program grant)

3:50 – 4:05: Jamelle Banks: NIH-Funded Research in the Context of A Scientific Field (NICHD)

4:05 – 4:20: Milton Hernandez: NIH Loan Repayment: Regression Discontinuity Analysis (OER)

4:20 – 4:35: Wesley Schultz: Use of Propensity Scores in a Longitudinal Study of Minority Biomedical Research Support (Cal State San Marcos/NIGMS)

4:35 – 5:00: Discussion

Common themes

  • Establishing comparability

  • Use of information technology

  • Compromises

  • Others….

Establishing a Comparison Set for Evaluating Unsolicited P01s at the National Institute of Environmental Health Sciences

AEA, November 14, 2009

Christie Drew – drewc@niehs.nih.gov 919-541-3319

Martha Barnes, Jerry Phelps, Pat Mastin


  • Brief overview of P01 grant mechanism and study goals

  • Finding a comparison group

    • Key Challenges

    • Approach

NIH Extramural Grant Context

  • Many different types of awards are given:

    • R: Research

    • P: Center (coordinated multi-project)

    • K: Career

    • T: Training

    • F: Fellowship

  • R01 = Research Project

    • Discrete, specified, circumscribed project performed by the named investigator(s) in a specific area of expertise

  • P01 = Research Program Project

    • Broad-based, multi-disciplinary, long-term; large groups working under the direction of an established researcher toward specific coordinated objectives, with each project supporting a common theme

    • Assumption = “whole” > sum of its parts

“Solicited” v. “Unsolicited”

  • Solicited = grants submitted in response to Funding Announcements or Program Announcements (specific $ set aside for funding)

  • Unsolicited = everything else. “Investigator Initiated” is a synonym

  • This analysis was focused on “unsolicited” P01s and R01s

  • Decision context: 2007 Moratorium on Unsolicited P01s, except “renewals”

Evaluation plan

  • Five Core Questions:

    • What is the overall investment in the unsolicited P01 program?

    • Are P01s able to achieve scientific outcomes that are greater than the sum of their parts?

    • Do P01s achieve synergy among sub projects and with their home institutions?

    • What are the key roadblocks/challenges inherent in P01s?

    • Is there a typical “natural history” of P01s?

  • Phase 1 – Answer as many questions as possible by Dec 2008 using available data.

  • Decide how to move forward with additional phases.

Compare Unsolicited P01s to Unsolicited R01s

  • The average P01 has 3x as many projects as an R01. Are they 3x as productive?

  • How do we identify “the right” P01s to compare?

Unsolicited P01 profile at NIEHS

  • 0 renewals (29)

  • 1 renewal (17)

  • 2 renewals (6)

  • 3 renewals (5)

  • 4 renewals (2)

  • 5+ renewals (4)

NIEHS P01 Science

These categories are an abstraction of the PCC Science codes – adapted from the T32 program analysis done in 2006.

Goal: Choose a Reasonable set of R01s for Comparison

Challenges (1)

  • Variation in data quality

    • IMPAC II data system improved significantly over time

    • Publication data, and especially publication data linked to grants has improved considerably in the past 5 years

    • PI track record of citing grants in publications improves over time

  • Responses

    • Narrowed our detailed analysis to 23 P01 grants active 2002-2007 (excluded the one that started in 2007)

    • Divided the cumulative number of publications by the number of years a grant had been operating
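The age normalization above is a simple rate; a minimal sketch (the grant counts below are invented, not NIEHS data):

```python
def pubs_per_year(cumulative_pubs: int, years_active: int) -> float:
    """Average publications per year, so grants of different ages
    can be compared on the same footing."""
    if years_active <= 0:
        raise ValueError("grant must have been active for at least one year")
    return cumulative_pubs / years_active

# A hypothetical P01 active for 6 years with 48 linked publications:
print(pubs_per_year(48, 6))  # → 8.0
```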

Challenges (2)

  • How to find a “scientific” match

    • Nearest Neighbor match – eSPA/Discovery Logic assisted

      • A mathematical, “Google-style” context-matching approach that focuses on words unique to a grant relative to a broader set

      • If a P01 had multiple science areas in its subprojects, we tried to match each area

    • Vetting with Program Officers

      • Provided 5-10 potential matches; Program Officers approved/disapproved each

        • Key criteria – “Would they publish in similar journals?”

      • Given overlaps in science, some R01s matched many P01s; resolving these multiple matches was tricky
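The eSPA/Discovery Logic matcher itself is proprietary, but the “unique words relative to a broader set” idea can be illustrated with a plain TF-IDF nearest-neighbor sketch. The abstract texts and grant IDs below are invented for illustration:

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF vectors (as sparse dicts) for tokenized documents; idf is
    smoothed (+1) so terms shared by all docs still contribute."""
    n = len(docs)
    df = Counter(t for doc in docs for t in set(doc))
    idf = {t: math.log(n / df[t]) + 1.0 for t in df}
    return [{t: c * idf[t] for t, c in Counter(doc).items()} for doc in docs]

def cosine(a, b):
    """Cosine similarity of two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def nearest_neighbors(query_text, candidates, k=3):
    """Rank candidate (id, text) pairs by textual similarity to the query."""
    docs = [query_text.lower().split()] + [t.lower().split() for _, t in candidates]
    vecs = tfidf(docs)
    scores = [(cosine(vecs[0], v), cid) for v, (cid, _) in zip(vecs[1:], candidates)]
    return sorted(scores, reverse=True)[:k]

# Hypothetical P01 description matched against two hypothetical R01s:
p01 = "airway inflammation from particulate matter exposure"
r01s = [
    ("R01-A", "particulate matter exposure and airway inflammation in mice"),
    ("R01-B", "zebrafish model of arsenic developmental toxicity"),
]
print(nearest_neighbors(p01, r01s, k=2))
```

In this toy example the shared, relatively rare terms (“particulate”, “airway”, …) pull R01-A to the top, while R01-B scores near zero; production systems layer stemming, stop-word removal, and much larger corpora on top of the same idea.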

Challenges (3)

  • Varying lengths of P01 programs

    • Chose longer R01s when possible to ensure valid comparisons, but this is a study weakness

    • Only R01s that began before 2006 were eligible

  • Small number in the study set (23) limited the comparisons

    • Aggregated the results – compared products of 23 P01s to the products of 98 R01s (rather than a matched case-control analysis)

Summary of the Decisions

  • Narrowed the analytical set to 23 Active P01s 2002-07

  • Identified “matching R01s” using the Discovery Logic “nearest neighbor” approach to generate candidates; Program Officers helped narrow and select better matches.

  • Selected 98 R01s; each P01 had 3-5 scientific matches.

  • Analysis completed on the aggregated sets.

  • Included the Solicited P01s as another reasonable comparison set for the Unsolicited P01s.


  • Was the comparison group reasonable?

  • What would we have gained/lost by doing a matched case-control analysis?

  • Are there other methods, such as propensity scores or regression discontinuity analysis, that we should consider?
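As a point of reference for the propensity-score question: such an approach pairs each treated unit (here, a P01) with the control (an R01) whose estimated probability of treatment is closest. A minimal greedy 1:1 matching sketch follows; the scores and grant IDs are hypothetical, and a real analysis would first estimate the scores from covariates (e.g., via logistic regression):

```python
def match_on_score(treated, controls):
    """Greedy 1:1 matching: pair each treated unit with the unused
    control whose propensity score is closest."""
    available = dict(controls)          # control id -> score
    pairs = []
    for tid, t_score in treated.items():
        cid = min(available, key=lambda c: abs(available[c] - t_score))
        pairs.append((tid, cid, round(abs(available[cid] - t_score), 3)))
        del available[cid]              # each control is used at most once
    return pairs

# Hypothetical pre-estimated propensity scores:
treated = {"P01-1": 0.72, "P01-2": 0.40}
controls = {"R01-a": 0.70, "R01-b": 0.45, "R01-c": 0.10}
print(match_on_score(treated, controls))
# → [('P01-1', 'R01-a', 0.02), ('P01-2', 'R01-b', 0.05)]
```

Greedy matching is order-dependent; optimal matching and caliper constraints are common refinements, and with only 23 P01s the overlap of score distributions would need careful checking.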

P01 Evaluation Committee Members

  • Barnes, Martha

  • Drew, Christie

  • Eckert-Tilotta, Sally

  • Gray, Kimberly

  • Lawler, Cindy

  • Loewe, Michael

  • Mastin, Pat

  • Nadadur, Srikanth

  • Kirshner, Annette

  • Phelps, Jerry

  • Puente, Molly

  • Reinlib, Leslie

    Additional Participants (Project Officers)

  • Jerry Heindel

  • Kim McAllister

  • Claudia Thompson

  • Fred Tyson


  • Thank you!
