Presentation Transcript

What Do Grant Reviewers Really Want, Anyway?

Robert Porter, Ph.D.

Proposal Development Team

University of Tennessee

www.research.utk.edu

reporter@utk.edu


slide2

Workshop objectives

  • Provide insights into the perspectives of grant reviewers:
    - Needs and expectations
    - Likes and dislikes
    - Recommendations to grant writers
  • Help Research Administrators with proposal development responsibilities to coach PIs more effectively

slide3

Topics

  • Overview of grant review processes at NSF and NIH:
    - Proposal volumes
    - Number of reviewers
    - Awards and success rates
    - Review processes
  • Summary of reviewer research findings:
    - Key expectations
    - Characterizations of strong proposals
    - Common grant writing mistakes
    - Lessons learned
    - Impact on grant writing
    - Advice to proposal writers

slide4

NSF Overview

  • An independent Federal agency
  • Funds research and education in most fields of science and engineering
  • Annual budget: ~$7 billion
  • Receives 55,000 - 65,000 proposals each year
  • Proposal success rates have leveled off at 23-25% in recent years

slide5

NSF Reviewers (2010)

  • Number serving on panels: 15,500
  • Mail reviewers: 30,500
  • Serving on both: -4,200
  • Total: 41,800
  • First-time reviewers: 9,400
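
The total reconciles with the figures above: the 4,200 reviewers who served in both capacities are subtracted so they are counted only once.

\[ 15{,}500 + 30{,}500 - 4{,}200 = 41{,}800 \]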
slide7

General NSF Review Criteria

• What is the intellectual merit of the proposed activity?

• What are the broader impacts of the proposed activity?

• Program-specific criteria may be listed in the program announcement.

slide8

Intellectual Merit – 5 strands

1) How important is the proposed activity to advancing knowledge and understanding within its own field or across different fields?*

2) How well qualified is the proposer to conduct the project?

3) To what extent does the proposed activity explore creative, original, or potentially transformative concepts?

4) How well conceived and organized is the proposed activity?

5) Is there sufficient access to necessary resources?

*Strongest emphasis in new definition

slide9

Broader Impacts – 5 strands

  • What may be the benefits of the proposed research to society?*
  • How well does the activity advance discovery and understanding while promoting teaching, training and learning?**
  • How well does the proposed activity broaden the participation of women and underrepresented groups? ("Diversity")
  • To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks and partnerships?
  • Will the results be disseminated broadly to enhance scientific and technological understanding?

*New emphasis in 2013

**Integration of education with research is required of all NSF proposals!

slide10

NSF: Possible rankings by reviewers

Individual rankings:

  • "Excellent"
  • "Very Good"
  • "Good" (not good!)
  • "Fair"
  • "Poor"

Panel recommendation:

  • "HIGH PRIORITY"
  • "MEDIUM PRIORITY"
  • "LOW PRIORITY"
  • "DO NOT FUND"

Remember: Only those proposals with a majority of "Excellents" are likely to be funded; POs have some flexibility.

slide11

NSF:

Awards: 13,000

Declines: 43,000

slide12

Proposal Success Rates by Directorate: FY2010

Average NSF success rate (2010): 23%
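
As a rough cross-check, the 23% average is consistent with the award and decline counts on the previous slide:

\[ \frac{13{,}000}{13{,}000 + 43{,}000} \approx 0.23 = 23\% \]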

slide13

NIH Mission

To acquire new knowledge to help prevent, detect, diagnose, and treat disease and disability, from the rarest genetic disorder to the common cold.

slide14

A Collection of Institutes:

(DHHS > PHS > NIH)

National Cancer Institute (NCI)

National Institute on Aging (NIA)

National Institute on Drug Abuse (NIDA)

National Heart, Lung and Blood Institute (NHLBI)

National Human Genome Research Institute (NHGRI)

National Institute of Environmental Health Sciences (NIEHS)

National Institute of Allergy and Infectious Diseases (NIAID)

National Institute of Neurological Disorders and Stroke (NINDS)

National Institute of Biomedical Imaging and Bioengineering (NIBIB)

National Institute of Arthritis and Musculoskeletal and Skin Diseases (NIAMS)

National Institute on Deafness and Other Communication Disorders (NIDCD)

National Center for Complementary and Alternative Medicine (NCCAM)

National Institute of Child Health & Human Development (NICHD)

National Institute of Dental and Craniofacial Research (NIDCR)

National Institute on Alcohol Abuse and Alcoholism (NIAAA)

National Institute of General Medical Sciences (NIGMS)

National Institute of Mental Health (NIMH)

National Library of Medicine (NLM)

National Eye Institute (NEI)

(and several more!)

slide15

NIH FY 2013 Budget: $31 Billion

  • Research Project Grants: 55% (~$17 billion)
  • Training: 3%

slide16

NIH Funding Priorities

  • Number of people who have a disease
  • Number of deaths caused by a disease
  • Degree of disability produced by a disease
  • Degree to which a disease cuts short a life
  • Economic and social costs of a disease
  • Need to act rapidly to control spread of a disease

Lesson: Cite data to quantify the impact of disease on health, society and the economy.

slide19

If “GO”...

  • Read the SF424 Application Guide carefully
  • Contact the Research Office; establish working relationship
  • Be prepared to address 5 traditional NIH review criteria

    - Significance: ability of project to improve health
    - Approach: feasibility of methods & budget
    - Innovation: originality of your approach*
    - Investigator: qualifications and experience of investigator(s)
    - Environment: suitability of facilities, equipment & institutional support

  • Plus the NEW criterion: IMPACT!

*NB: too much innovation can be risky

slide20

Peer Review: New Scoring System

  • Old 1-to-5 scale replaced by a 9-point scale (1 = "Exceptional" and 9 = "Poor")
  • Most important new score will be the final IMPACT rating (1 to 9)
  • Ratings will be in whole numbers only; no decimals
  • Reviewers will also provide numerical ratings for each of five traditional NIH criteria:
    - Significance
    - Investigator(s)
    - Innovation
    - Approach
    - Environment


slide21


New Scoring System, cont’d


  • Preliminary score: reviewers send in their scores for the 5 traditional criteria, plus the final IMPACT score
  • Note: the Impact score is an independent rating, not an average of the 5!
  • Applications in the lower half are "less competitive" and will Not be Discussed
  • PIs of "ND" proposals WILL receive all scores from individual reviewers, but no overall IMPACT score
  • After discussing competitive proposals, reviewers may change their scores
  • Reviewer scores are averaged and multiplied by 10, for a range of 10-90 (see the sketch after this list)
  • Average IMPACT scores are then percentiled for final ranking to determine funding order
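
A minimal sketch of the scoring arithmetic described above (not NIH's own code): each reviewer gives a whole-number rating from 1 ("Exceptional") to 9 ("Poor"); the ratings are averaged and multiplied by 10 to give an overall score in the 10-90 range. Rounding to the nearest whole number is an assumption for illustration, and the subsequent percentiling against the rest of the application pool is not shown.

```python
def overall_impact_score(reviewer_ratings):
    """Illustrative only: average whole-number ratings (1-9) and scale by 10.

    Returns a value in the 10-90 range, as described on the slide above.
    Rounding to the nearest whole number is an assumption, not a quoted NIH rule.
    """
    if not reviewer_ratings:
        raise ValueError("At least one reviewer rating is required")
    if any(r < 1 or r > 9 for r in reviewer_ratings):
        raise ValueError("Each rating must be between 1 (Exceptional) and 9 (Poor)")
    return round(10 * sum(reviewer_ratings) / len(reviewer_ratings))

# Example: three reviewers rate an application 2, 3, and 2.
print(overall_impact_score([2, 3, 2]))  # 10 * (7/3) = 23.3 -> 23
```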
slide22


New Scoring System, cont’d


Definition of the 9-point scale:

slide23

NIH Study Section Video

www.csr.nih.gov/

slide25

What Do Grant Reviewers Really Want, Anyway?

Profile of Reviewers Interviewed:

  • 16 Senior faculty at Virginia Tech; 10 men, 6 women
  • 12 full professors (2 Associate Deans for Research); 4 associate professors
  • Disciplines: Physical Sciences and Engineering, Social and Behavioral
  • Average number of review panels served: 10
  • Top funding agencies: NSF, NIH, USDA, DoD
  • Average awards earned in five years (1999-2004): 8.3
  • Average dollars awarded: $2.2 million
slide26

Reviewers’ motivation:

Why volunteer so much time?

Answers:

Learning the ropes in order to write better proposals: "To see how the game is played." "To pick up on what reviewers like and don't like."

Service to science: "I benefitted from this process and felt I had to give back." "This was a way I could contribute to the high quality review process at NIH."

Keeping current: A good way to keep up with the discipline and learn about future directions.

Professional networking: Building a network of professional contacts with peers and program managers at sponsor agencies.

slide27

Preparation for panel meeting

  • Receive anywhere from 20 to 100 proposals (~2 weeks ahead)
  • Assigned as primary or secondary reviewer on 6 to 8 (written reviews required)
  • Average 35 hours reading and writing (range: 15 to 60)
  • While most said they themselves came prepared, writing up to the last minute is not uncommon:

"We spend the first hour standing around drinking coffee while these folks are still pecking away at their keyboards."

slide28

Reviewer expectations at first reading:

Strong consensus:

They want to learn very quickly what the project is all about and whether it fits program objectives.

Also look for:

Writing that is clear and concise (“concise” being most common descriptor)

Interesting, innovative ideas that will contribute to the field

Solid preliminary data showing the approach has promise

A crisp, specific project description with a well thought out research plan

Evidence that the PI is well qualified to do the research

Note: First impressions are critical: "The abstract must sell the grant. If I don't get interested by the first page, the proposal is lost."

Big picture must be compelling: "I get to the gestalt, the big picture first. If I like it, then I'll go on to the details. If I don't, I'm done reading."

slide29

What are the characteristics of a good proposal?

  • Document is neat, well organized and easy to read;
  • A good fit; shows how project will achieve program goals;
  • Provides fresh insight into an important problem;
  • Writing communicates enthusiasm and commitment;
  • Evidence the PI knows the field;
  • Convincing preliminary data;
  • Feasible work plan with appropriate budget
slide30

Proposal must speak to the reviewer:

  • To generate interest and enthusiasm to match the writer's: "You want the feeling 'This is really great, this study has to be done.' It's like a fire in the belly, or knocking your socks off. It makes you want to say, 'Darn, I wish I had thought of this!'"
  • Should be a good learning experience: "The best proposals teach."
slide31

Common proposal writers’ mistakes:

  • Most common: inappropriate writing style
    - Vague and unfocused: "Takes me too long to figure out what they want to do"
    - Dense academic prose: "Written like a journal paper"
    - Verbosity: small fonts and crowded margins
  • Other mistakes:
    - Incomplete response to the program solicitation
    - Writer does not understand the state of the art
    - Project too ambitious, global in scope
    - Research plan is vague ("Trust me" syndrome)
    - PI lacks proven competence to do the research

Special annoyance: sloppiness, lack of proofreading!

slide32

Objectivity of review panels:

  • While disgruntled PIs sometimes accuse panels of being biased, reviewers rate objectivity highly:
    - Bias seen as "nil" or "nonexistent"
    - "System isn't perfect, but it's the fairest one possible."
    - Panel dynamics are democratic and self-correcting
    - Very difficult for any one person or group to dominate
  • BUT: Occasionally there is some favoritism toward senior researchers based on past record ("funding on the come")
slide33

Impact of Review Panel Experience on Grant Writing:

Consensus: Service on review panels dramatically improves proposal writing.

"You learn to put the reviewer's hat on. You know what the panel is looking for; you can hear their discussion in your head while you're writing."

"I used to write to a peer; now I write to a committee. And I make it easy to read!"

Other improved skills:

A simpler, livelier writing style

Key points laid out very early

Clear organization with frequent section headings

Use of more visual illustrations

slide34

Reviewers’ advice to proposal writers:

“Communicate with grant program officers before deciding to write; establish a relationship.”

“Study, study, study the program call.”

“Make your proposal easy to read.”

“Start much earlier than you think you have to.”

“Make sure you know what’s already been done.”

“Write in an accessible way that can be understood by a diverse group.”

“Don’t take rejection personally.”

“Get in the habit of resubmitting.”

slide35

Summing up…

  • Peer review strengths:
    - System epitomizes democratic self-determination (researchers chart their own future direction)
    - Panels are diverse, assuring a good cross section of ideas
    - Despite weaknesses, still the best means to preserve scientific integrity

"The research community decides its own fate by determining what good science is."

slide36

Summing up (cont’d)…

  • Peer review weaknesses:
    - A strongly opinionated panelist can exert undue influence on the group
    - "Veto" effect: just one less-than-enthusiastic comment by a discussion leader can doom a proposal
    - Workloads can be heavy; hard to give a fair hearing to a large number of proposals in a single batch
    - Gender tax: women especially pressured to participate more often
    - Splitting hairs: with intensifying competition, many decisions based on minor qualities
    - Incrementalism: panels stay safe, too often shying away from more daring ideas
slide37

Summing up (cont’d)…

  • Final recommendations from reviewers:
    - Allow more time for panel meetings
    - Find ways to recruit more reviewers and lighten the workload
    - Allocate more money to exploratory, high-risk work
slide38

Some good news…

Recently:

  • NIH has:
    - Reduced maximum proposal page lengths by half
    - Introduced more flexible scoring for New and Early Stage Investigators
    - Required reviewers to provide written feedback to all PIs on the five major review criteria
  • NSF has:
    - Placed increasing emphasis on "transformational potential" in its review process
    - Implemented RAPID and EAGER awards