
Presentation Transcript


  1. Archived File The file below has been archived for historical reference purposes only. The content and links are no longer maintained and may be outdated. See the OER Public Archive Home Page for more details about archived files.

  2. EUREKA

  3. EUREKA! (Exceptional, Unconventional Research Enabling Knowledge Acceleration)
  Can we encourage scientific leaps?
  • Leaps instead of incremental progress
  • NIH support of some risky, high-impact research
  EUREKA is designed to support innovative research
  • Unconventional hypothesis or solution for major technical/methodological challenges
  • High impact on the scientific community
  • Emphasis on the idea rather than the PI
  • Specifics:
    • R01 (4 years, $200K direct costs/year)
    • NIGMS, NINDS, NIMH, NIDA
    • Multiple PIs allowed

  4. Unconventional Application: Focus on novelty and magnitude of impact
  Research Plan
  • 8-page limit
  • 3 pages for the Approach
  • 1 Specific Aim
  • Must address:
    • Challenge
    • Potential impact
    • Approach
    • Fit for EUREKA
    • Likelihood of success
    • Timeline
  Biosketch
  • 4-page limit
  • Cite up to 10 publications
  • State how the publications show the PI's relevant knowledge/experience, innovation, or broad impact
  Other Features
  • 1-page Literature Cited
  • Preliminary data not required
  • No appendix material

  5. EUREKA Review Challenges
  • Innovative ideas may not generate consensus
  • Balancing the need for expert opinion with input from reviewers outside the field
  • Large number of applications and broad range of science

  6. Comparison of EUREKA reviews by NIGMS and by NIDA/NIMH/NINDS
  • Both used a two-phase review
  • Both used a structured review form for Phase 1
  • Both used whole numbers for preliminary scores
  • NIGMS used an Asynchronous Electronic Discussion (AED) for Phase 2; Neuroscience used an in-person meeting
  • Neuroscience used an initial "binning" of discussed applications, followed by a final scoring phase

  7. Comparison of EUREKA reviews by NIGMS and by NIDA/NIMH/NINDS

  8. EUREKA Review: NIGMS details

  9. Phase 1 Review: Streamline (review workflow: Streamline → Phase 1 Review → Phase 2 Discussion → Final Scoring)
  • Balance workload against the chance of seeing an innovative application (see the arithmetic sketch after this slide)
    • 52 reviewers for 226 applications (single panel)
    • 15 applications per reviewer (typical), 3 reviewers per application
  • Expertise vs. broad knowledge
    • At least one reviewer had a 'good' expertise match
    • Reviewers outside the field appreciate 'expert opinion'
  • Focus reviewers on EUREKA criteria
    • Telephone orientation conferences
    • Structured review form emphasizing the unique features of EUREKA
    • Preliminary scores in whole numbers (1, 2, 3, 4, 5)
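The reviewer-load figures on this slide follow from simple arithmetic: 226 applications needing 3 reviews each gives 678 review slots spread across 52 reviewers, or about 13 applications per reviewer on average. The Python sketch below works through this; the round-robin assignment is hypothetical, since the slides do not describe how NIGMS actually distributed applications.

```python
from itertools import cycle

N_APPLICATIONS = 226
N_REVIEWERS = 52
REVIEWS_PER_APPLICATION = 3

# Each application needs 3 independent reviews -> total review slots.
total_slots = N_APPLICATIONS * REVIEWS_PER_APPLICATION  # 678

# Evenly spread, that is ~13 applications per reviewer on average; the
# slide's "15 per reviewer (typical)" reflects the uneven loads that
# expertise matching produces in practice.
average_load = total_slots / N_REVIEWERS  # ~13.04

# Illustrative round-robin assignment (ignores expertise matching).
loads = {reviewer: 0 for reviewer in range(N_REVIEWERS)}
next_reviewer = cycle(range(N_REVIEWERS))
for _slot in range(total_slots):
    loads[next(next_reviewer)] += 1

print(f"average load: {average_load:.1f}")      # 13.0
print(f"heaviest load: {max(loads.values())}")  # 14 (678 = 52*13 + 2)
```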

  10. Review form highlights the criteria most relevant to EUREKA
  • Significance
  • Innovation
  • Approach
  • Investigator
  • Environment
  • Overall score
  • Overall comments
  Comments REQUIRED for a reviewer's top 3 applications.
  (Portion of the 2-page review form shown)

  11. Prioritizing applications for Phase 2 discussion
  • Innovative ideas may not generate consensus, so applications were prioritized for full discussion based on their best score instead of their average score
  • An application was discussed if:
    • Two or more reviewers found it 'extremely exciting' (two or more scores of '1'), OR
    • One reviewer found it 'extremely exciting' and requested that it move forward
  • Result: 43 out of 226 applications were discussed (a sketch of this triage rule follows below)
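To make the rule concrete, here is a minimal sketch of the best-score triage described above, assuming whole-number preliminary scores where '1' means 'extremely exciting'; the class, field, and function names are illustrative only, not part of the actual NIGMS system.

```python
from dataclasses import dataclass

@dataclass
class Application:
    scores: list[int]              # whole-number preliminary scores (1-5)
    flagged_forward: bool = False  # a '1'-scoring reviewer asked to discuss it

def discuss_in_phase2(app: Application) -> bool:
    """Triage on the best score, not the average: one champion can suffice."""
    ones = app.scores.count(1)
    return ones >= 2 or (ones == 1 and app.flagged_forward)

# Three reviewers per application, as in Phase 1:
assert discuss_in_phase2(Application([1, 1, 3]))                        # two '1's
assert discuss_in_phase2(Application([1, 4, 5], flagged_forward=True))  # lone champion
assert not discuss_in_phase2(Application([2, 2, 3]))                    # no '1' at all
```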

  12. Phase 2 Discussion
  • Discuss and score the most innovative applications
  • Single panel
  • Asynchronous Electronic Discussion (web-based)
    • Allowed the majority of Phase 1 reviewers to participate
    • Reviewers may be more comfortable expressing opinions
    • Reviewers had time to read or re-read applications

  13. EUREKA Review: Neuroscience details

  14. Neuroscience EUREKA Review
  • Collaboration involving NIMH, NINDS, and NIDA
    • Mark Swieter (NIDA)
    • David Armstrong, Megan Libbey & Vin Charles (NIMH)
    • Phillip Wiethorn, Bill Benzing & Alan Willard (NINDS)

  15. Applications were assigned to one of seven broad scientific categories

  16. Phase 1: Five reviewers were assigned to each of the seven scientific categories

  17. Each reviewer evaluated every application within their assigned scientific category
  • Major emphasis was placed on innovation and significance.
  • Review criteria were individually rated on a template; comments were encouraged but not required.
  • An overall score (1-5) was provided using whole numbers.
  • The suggested target was a score of "1" for the top 4 applications in each bin; in practice, individual reviewers gave between 2 and 6 scores of "1".
  • Reviews were submitted electronically 3 weeks before the Phase 2 in-person meeting.

  18. Example of a criterion on the Neuroscience EUREKA review template
  Significance: This study addresses an important problem, and the outcome of the proposed studies will drive the field. The potential impact of the proposed research is exceptional, in terms of both the magnitude of the impact and the size of the community affected.
  Strongly Agree ___  Moderately Agree ___  Neither Agree nor Disagree ___  Moderately Disagree ___  Strongly Disagree ___
  Comments:

  19. Neuroscience Phase 2: In-person meeting
  • 17 of the 35 reviewers (roughly two per scientific category) participated in an in-person meeting
  • Ten days prior to the review meeting, participants held a teleconference and determined a tentative triage list based on the initial electronic postings
  • Scientific categories were collapsed, and reviewers were asked to evaluate all applications in the discuss category (roughly 30% of the total)
  • Following discussion, each application was initially assigned to one of four bins: best (1.0-1.5), better (1.5-2.0), good (2.0-2.5), or unscored (a sketch of this binning appears after this slide)
  • After all applications had been discussed, initial bin assignments were re-evaluated and final scores were cast
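A minimal sketch of the binning step described on this slide, assuming half-open score intervals to resolve the overlapping boundary values shown (1.5 and 2.0 each appear in two bins); the function name and the treatment of scores above 2.5 are assumptions made for illustration.

```python
def assign_bin(score: float | None) -> str:
    """Provisional bin for a discussed application; None means left unscored."""
    if score is None:
        return "unscored"
    if 1.0 <= score < 1.5:
        return "best"
    if 1.5 <= score < 2.0:
        return "better"
    if 2.0 <= score <= 2.5:
        return "good"
    return "unscored"  # worse than 2.5: assumed left unscored

print(assign_bin(1.2))   # best
print(assign_bin(1.8))   # better
print(assign_bin(2.4))   # good
print(assign_bin(None))  # unscored
```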
