
NSERC DG Advices


Presentation Transcript


  1. NSERC DG Advices yann-gael.gueheneuc@polymtl.ca Version 0.5 2013/07/07

  2. Questions: I welcome them all at yann-gael.gueheneuc@polymtl.ca

  3. Disclaimer: I cannot be held responsible for the failure or success of your applications, whether or not you follow this advice

  4. NSERC DG Advices • Each NSERC DG application is evaluated according to 4 criteria and 6 merit indicators • Excellence of the researcher • Merit of the proposal • Contribution to the training of HQP • Cost of research

  5. NSERC DG Advices • Each NSERC DG application is evaluated according to 4 criteria and 6 merit indicators • Exceptional • Outstanding • Very strong • Strong • Moderate • Insufficient

  6. NSERC DG Advices • How are these criteria rated by the reviewers using the indicators? • How to ease the reviewers’ job and, possibly, be more successful?

  7. Outline • Process in a nutshell (candidate’s point of view; internal reviewer’s point of view; off-line work; competition week) • Funding decisions (in a nutshell; bins; ER vs. ECR) • Criteria and indicators (“values” of criteria; “meanings” of indicators; NSERC rating form; my own rating form) • Advices (introduction; excellence of the researcher; merit of the proposal; contribution to the training of HQP; Form 100; Form 101) • Conclusion • Further readings

  8. Outline (identical to slide 7)

  9. Process in a Nutshell • From the candidate’s point of view • August 1st: submission of Form 180 • November 1st: final submission of Forms 100, 101, and publications • March/April: announcement of the results

  10. Process in a Nutshell • From the internal reviewer’s point of view • Two main parts • Off-line work, e-mails/readings • Competition week in Ottawa

  11. Process in a Nutshell • Off-line work • August 27th: reception of all the submissions • In 2012, 322 submissions • September 7th: ratings (expertise levels and conflicts) of all the submissions • High, Medium, Low, Very Low, Conflict, X (language) • September 24th: final choice of the 1st internal reviewers for each application • In 2012, 14 applications as 1st internal reviewer, 15 as 2nd internal reviewer, 17 as reader = 46

  12. Process in a Nutshell • Off-line work • October 5th: choice by the 1st internal reviewer of 5 external referees • In 2012, 14 applications × 5 = 70 referees • The list may include referees suggested by the candidate but may also replace all of them • October 22nd: ratings of applications from other evaluation groups • In 2012, 1 application

  13. Process in a Nutshell • Off-line work • Early December: final list of readings • In 2012, 47 applications • January/February: reception of the reports from the external referees • In 2012, 123 reports • February 18th to 22nd: competition week in Ottawa during which each application is discussed and rated

  14. Process in a Nutshell • Off-line work • In 2012 (and I suspect every year), a lot of work! • 322 submissions • 47 applications (including joint publications) • 70 referees • 123 referee reports • Make it easier for the reviewers!

  15. Process in a Nutshell • Competition week • February 18th to 22nd: competition week in Ottawa during which each application is discussed and rated • 5 days • In 2012 (and I suspect every year), very intense, demanding, and tiring

  16. Process in a Nutshell • Competition day • Starts at 8:30am • Is divided into • 31 15-minute slots • 2 15-minute breaks • 1 45-minute lunch • Ends at 5:15pm • If no deferred applications to re-discuss • In 2012, 1 application

  17. Process in a Nutshell • Competition slot • In a 15-minute slot, the ratings of an application are chosen by the reviewers • Or the application is “deferred”, to be re-discussed at the end of the day

  18. Process in a Nutshell • Competition slot • The 1st internal reviewer gives ratings with justifications, which must be facts from the Forms • The 2nd internal reviewer contrasts, supports, and adds missing facts from the Forms • The readers complement or challenge the ratings given by the 1st and 2nd internal reviewers, which must be supported by facts from the Forms

  19. Process in a Nutshell • Competition slot • The 1st internal reviewer gives ratings with justifications, which must be facts from the Forms • In 2012, a typical presentation followed this pattern (not exactly the NSERC criteria) • Candidate: career, funding, visibility, publications, HQP record, planned training • Proposal: context, lacks, characteristics (Incremental? Applicable? Feasible?) • External: summary of the referees’ reviews, summary of the provided contributions • Then, the reviewer would give their ratings

  20. Process in a Nutshell • Competition slot • The session chair keeps time strictly • The session chair makes sure that any discussion sticks to the facts

  21. Process in a Nutshell • Competition slot • Ratings are anonymous • Secret electronic vote • The session chair announces the results • Ratings are consensual • If reviewers/readers strongly disagree, the application will be re-discussed at the end of the day • In 2012, I did not see any strong debates: mostly, the 1st and 2nd internal reviewers agreed, backed up by the readers • In 2012, some facts were sometimes highlighted and ratings were changed accordingly

  22. Process in a Nutshell • Competition slot • Any criterion rated as moderate or insufficient receives comments from the committee, reflecting the consensus of the reviewers (highly focused) • In 2012, NSERC provided typical comments, for example: “The applicant did not take advantage of the available space in Form 100 to make a compelling case about his/her most significant research contributions. Given the lack of information, the EG was unable to carry out a thorough assessment and potentially recommend a higher rating.”

  23. Outline (identical to slide 7; next section: Funding decisions)

  24. Funding Decisions • In a nutshell • Each proposal is rated secretly by the reviewers after the discussions • The median of the ratings is used for each criterion • For example • Excellence of the researcher: {S, S, M, M, M}, rating is M • Merit of the proposal: {V, V, S, S, M}, rating is S • Impact of HQP: {V, S, S, S, M}, rating is S • The application rating is therefore {M, S, S}

  25. Funding Decisions • Bins • The numeric “values” of the ratings are “added” • For example, {M, S, S} → 2 + 3 + 3 = 8 • The application is placed into one of 16 bins • The bins are labelled A through to P and correspond numerically to 18 down to 3

  26. Funding Decisions • Bins • Bins A and P are uniquely mapped to {E, E, E} and {I, I, I}, while other bins contain a mix of numerically equivalent ratings, e.g., {V, S, M} is in the same bin as {S, S, S} and {M, S, V} • For example, the application rated {M, S, S} is in bin K • Not all applications in a bin are funded: {S, S, S} may be funded while {M, S, V} is not • Because of the moderate indicator for the first criterion • The cut-off point depends on the year
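
Below is a minimal sketch, in Python, of the bin arithmetic described on slides 24 to 26. It is my own illustration, not NSERC code; it assumes the numeric values Insufficient = 1, Moderate = 2, Strong = 3, Very strong = 4, Outstanding = 5, Exceptional = 6 implied by the example {M, S, S} → 2 + 3 + 3 = 8, with bins A to P corresponding to the totals 18 down to 3.

    # Sketch only: median per criterion (slide 24), sum of the three criterion
    # ratings (slide 25), and mapping of the total to a bin letter (slide 26).
    from statistics import median

    VALUES = {"I": 1, "M": 2, "S": 3, "V": 4, "O": 5, "E": 6}

    def criterion_rating(votes):
        # Median of the reviewers' secret votes for one criterion.
        return int(median(VALUES[v] for v in votes))

    def bin_letter(criterion_ratings):
        # A total of 18 maps to bin A, a total of 3 maps to bin P.
        total = sum(criterion_ratings)      # e.g., {M, S, S} -> 2 + 3 + 3 = 8
        return chr(ord("A") + 18 - total)

    # Example from slide 24.
    votes = [["S", "S", "M", "M", "M"],     # Excellence of the researcher -> M
             ["V", "V", "S", "S", "M"],     # Merit of the proposal        -> S
             ["V", "S", "S", "S", "M"]]     # Impact of HQP                -> S
    ratings = [criterion_rating(v) for v in votes]
    print(ratings)                          # [2, 3, 3]
    print(bin_letter(ratings))              # K, as on slide 26

This arithmetic only determines the bin; as slide 26 notes, funding within a bin still depends on the individual ratings and on the year's cut-off point.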

  27. Funding Decisions • ER vs. ECR • Candidates are divided into • ER: established researchers, who have already applied to (and possibly been funded by) the NSERC DG • ECR: early-career researchers, who apply to the NSERC DG for the first time • ECRs are funded one bin “lower” (better) than ERs

  28. Outline (identical to slide 7; next section: Criteria and indicators)

  29. Criteria and Indicators • “Values” of criteria • Excellence of the researcher • Merit of the proposal • Contribution to the training of HQP • Cost of research

  30. Criteria and Indicators • “Values” of criteria • Excellence of the researcher • Knowledge, expertise and experience • Quality of contributions to, and impact on, research areas in the NSE • Importance of contributions

  31. Criteria and Indicators • “Values” of criteria • Merit of the proposal • Originality and innovation • Significance and expected contributions to research; potential for technological impact • Clarity and scope of objectives • Methodology and feasibility • Extent to which the scope of the proposal addresses all relevant issues • Appropriateness of, and justification for, the budget • Relationship to other sources of funds • Note (slide callout): not really important; amounts of previous grants (in particular the NSERC DG) should be ignored

  32. Criteria and Indicators • “Values” of criteria • Contribution to the training of HQP • Quality and impact of contributions • Appropriateness of the proposal for the training of HQP in the NSE • Enhancement of training arising from a collaborative or interdisciplinary environment, where applicable

  33. Criteria and Indicators • “Values” of criteria • Cost of research • Rationale • Note (slide callout): not really important, but you cannot get more than what you ask for, no matter the merit

  34. Criteria and Indicators • “Meanings” of indicators • Exceptional • Outstanding • Very strong • Strong • Moderate • Insufficient

  35. Criteria and Indicators • “Meanings” of indicators • Exceptional • In 2012, I did not see any exceptional ratings • Outstanding • Very strong • Strong • Moderate • Insufficient

  36. Criteria and Indicators

  37. Criteria and Indicators • NSERC rating form • NSERC provides a 2-page rating form • In 2012, I found that this rating form does not follow the presentation pattern used during the competition slot because it scatters the information • In 2012, however, each application was obviously rated according to the 4 criteria and 6 indicators

  38. Criteria and Indicators • NSERC rating form (1/2)

  39. Criteria and Indicators • NSERC rating form (2/2)

  40. Criteria and Indicators • My own rating form • Researcher / Proposal / HQP

  41. Outline (identical to slide 7; next section: Advices)

  42. Advices • Introduction • Reviewers receive two to three dozen applications • Overall, upon first review, the quality is impressive, thus generating a positive reaction • The objective, however, is to discriminate, initiating a vigorous search for flaws

  43. Advices • Introduction • Reviewers may perceive aspects of applications as confusing, ambiguous, incomplete, or just not compelling • They will not give the benefit of the doubt • In 2012, I witnessed some excellent researchers receiving low ratings because of sloppiness in their applications

  44. Advices • Introduction • Reviewers will most likely “mine” the Forms 100, 101, and publications to make up their minds regarding the 4 criteria • Make it easy for them to mine your applications!

  45. Advices • Introduction • Form 100 • Form 101

  46. Advices • Introduction • Form 100 • Is used for two of the three important criteria • Form 101 • Is used mostly for the merit of the proposal

  47. Excellence of the Researcher • Form 100 • Career: pages 1-2 • Funding: pages 3-… • Visibility • “Other Evidence of Impact and Contributions” • Awards, chairing, editorship, organisation, seminars: anything showing external acknowledgments • Publications • “Other Research Contributions” • Quantity and quality

  48. Excellence of the Researcher • Form 101 • Essentially nothing • Contributions • Important for the experts, should be explained for the non-experts in Form 100, “Most Significant Contributions to Research” • External reviewers • Confirm/contrast findings in the Form 100, 101, and the publications

  49. Merit of the Proposal • Form 101 • Context • Is the application well positioned? • Lacks • Any problems not discussed? • Incremental? • How innovative? • Applicable? • Usefulness, even remote? • Feasible? • Methodology

  50. Merit of the Proposal • Form 101 • Reviewers may also look for • Knowledge of the key issues (background) • Originality and innovation (background limits) • Clarity of scope and objectives • Methodology • Trust/confidence that you can do the work • Significance
