
ARC Tips and Tricks

ARC Tips and Tricks. Professor Andrew Cheetham, PVC – Research & Information Management, University of Canberra. Professor Cheetham would like to acknowledge input and advice from the ARC in compiling this presentation, in particular discussions and contributions from Professor Lawrence Cram.


Presentation Transcript


  1. ARC Tips and Tricks. Professor Andrew Cheetham, PVC – Research & Information Management, University of Canberra. Professor Cheetham would like to acknowledge input and advice from the ARC in compiling this presentation, in particular discussions and contributions from Professor Lawrence Cram.

  2. The Agenda
  • Some History…
  • The Assessment Criteria
  • The Scoring System
  • The Assessment Weightings
  • Features of Top Ranked applications
  • Features of Low Ranked applications
  • How to interpret the comments
  • Some tips…

  3. Discovery Projects 2000-2007

  4. Linkage Projects 2001-2006. Typical success rate is about 50%.

  5. The Assessment Text Criteria
  • Assessors are asked to use the following Text Criteria:
  • Outstanding. Of the highest merit, at the forefront of international research in the field. Fewer than 2% of applications should lie in this band.
  • Excellent. Strongly competitive at international levels. Fewer than 20% of applications should lie in this band.
  • Very good. An interesting, sound, compelling proposal. Approximately 30% of applications will have a score in this band.
  • Good. A sound research proposal that lacks a compelling element in some respect. Approximately 30% of applications are likely to fall into this band.
  • Fair. The proposal has potential, but requires significant development to be supportable. Up to 20% of applications are likely to lie in this band or the next lower one.
  • Flawed. The proposal has one or more fatal flaws.

  6. Scores vs Distribution
  • Outstanding: score 100 down to 90
  • Excellent: 90 down to 85; Successful (Discovery) cut-off at 85 (80%)
  • Very Good: 85 down to 80; Successful (Linkage) cut-off at 80 (50%)
  • Good: 80 down to 75 (20%)
  • Fair: 75 down to 70 (0%)
  • Flawed: below 70, down to 0
  Note the important difference between the numerical scores and the distribution of the Text Criteria.
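As a quick illustration, the score-to-band mapping on this slide can be expressed as a simple lookup. This is a sketch only: the slide does not say which band owns boundary scores such as 90 or 85, so the `>=` boundary handling below is an assumption.

```python
# Illustrative mapping from a numeric assessor score to the ARC text
# criterion band shown on slide 6. Boundary ownership (e.g. whether a
# score of exactly 85 is "Excellent" or "Very Good") is an assumption.
def text_band(score: float) -> str:
    bands = [
        (90, "Outstanding"),  # 90-100
        (85, "Excellent"),    # 85-90
        (80, "Very Good"),    # 80-85
        (75, "Good"),         # 75-80
        (70, "Fair"),         # 70-75
    ]
    for lower, name in bands:
        if score >= lower:
            return name
    return "Flawed"           # below 70

print(text_band(95))  # Outstanding
print(text_band(82))  # Very Good
print(text_band(60))  # Flawed
```

On these boundaries, a Discovery application scoring 85 or above sits at or above the "Successful (Discovery)" cut-off, while a Linkage application needs roughly 80.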

  7. Score weights:

  8. Top-ranked applications
  • There are several features that Top Ranked Applications have in common:
  • Manage to balance technicality and accessibility
  • Present problems and/or controversies and explain how they will solve them
  • Explain how the momentum of the subject demands funding now
  • Show how Australian work fits into the international picture
  • Set their work in the context of the National Priority Framework
  • Back up compelling claims with evidence and others' judgments
  • Carefully temper ambitious goals with plausible approaches
  • Display evidence of responsible but often daring approaches to the problem
  • Chief Investigators have demonstrable evidence of strong international track records
  • Present excellent progress reports on previous grants

  9. Low-ranked applications
  • Not surprisingly, Low Ranked Applications also have features in common:
  • Use too much technical jargon
  • Make grandiose and implausible claims about outcomes
  • Don't support claims of excellence or progress with evidence
  • Relate to "backwater" research with no momentum
  • Are weakly linked into national and international research networks
  • Tend to emphasise the collection of data rather than the solution of controversies
  • Set a negative or depressing tone about the state of the subject in Australia
  • Contain a high rate of spelling, grammatical and technical errors
  • Often have unedited nonsense in the text (it is sometimes unclear whether this is inadvertent)

  10. How to read an assessment
  • Assessors tend to be "kind" and "circumspect" in their criticism.
  • The words "good" and "very good" are weaker praise than "outstanding" and "excellent".
  • The absence of "outstanding" and "excellent" MAY be code for "good but not among the best".
  • Read assessor text as "bad news" rather than "good news": it is rare for adverse criticisms not to be coded in the text, and the EAC will be looking for evidence of this harmony between rank and text.
  • Don't assume that bland positive statements are favourable to you; other applications are probably getting HOT POSITIVE comments.
  • Remember that EAC members see all the assessment texts together:
  • "Risky and innovative" by itself might be good news, but if another assessor says "improbably ambitious" the two texts reinforce a possible weakness.
  • "Adds important knowledge" may be positive, but maybe not if another assessor says "only limited value since this is a rather pedestrian extension of earlier work by the applicant".

  11. Some ARC advice:
  • Start early:
  • Discovery: at least 9 months before the deadline; have the grant complete 2 months before the deadline for peer review and polishing.
  • Linkage: building relationships with possible industry partners should begin 18 months before the deadline.
  • No track record? (40% of Discovery score, 25% of Linkage)
  • Forget it; get funds from elsewhere to build your track record. The current Australian Competitive Grants Register is over three pages long.
  • Partner only mildly interested? (25% of Linkage score)
  • Forget it; the readers will notice. Build the relationship.
  • Transparency and clarity of the application is everything.
