
Is success divisible? How to evaluate success successfully?! New Frontiers In Evaluation

Institute for Research Information and Quality Assurance. Is success divisible? How to evaluate success successfully?! New Frontiers In Evaluation, Session F “Talking about Success”, April 25th 2006, 9:40 a.m. - 12:30 p.m. Stefan Hornbostel, Saskia Heise


Presentation Transcript


  1. Institute for Research Information and Quality Assurance. Is success divisible? How to evaluate success successfully?! New Frontiers In Evaluation. Session F “Talking about Success”, April 25th 2006, 9:40 a.m. - 12:30 p.m. Stefan Hornbostel, Saskia Heise. IFQ, Institut für Forschungsinformation und Qualitätssicherung (Institute for Research Information and Quality Assurance), Godesberger Allee 90, D-53175 Bonn, www.forschungsinfo.de

  2. IFQ project “Final Reports”
• The work invested in final reports receives little to no appreciation:
• for the scientists, only the findings and publications from a project are of importance, not the writing of a final report
• for the reviewing experts, the judgement about the funded project has already been given
• from the viewpoint of the funding agency, a project is finished when its funding period ends.
• We suggest that the information included in a final report should be suitable for monitoring research activity. On the basis of our project, we would like to investigate whether final reports, in particular the documented output of a project, could be used as an evaluative tool in DFG research funding.

  3. IFQ project “Final Reports”
• The targets of our project “Final Reports” are:
• to build up a research monitoring system
• to develop an information tool that provides web access to the findings

  4. German Research Foundation (Deutsche Forschungsgemeinschaft, DFG): Guidelines for final reports (manual)
• General information: DFG reference number, applicants, project leader, institution, topic, funding period, publications
• Research activities and findings: exploratory questions and targets; work conducted, in particular discrepancies and scientific failures; findings, application, connections, usability; patents, industrial co-operations, co-operation partners, project collaborators; diploma theses, PhD theses, postdoctoral qualifications
• Summary: generally understandable presentation, unexpected results in the course of the project, dissemination of information outside the scientific community, possibly press coverage
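The field list above maps naturally onto a record in the planned monitoring system. Below is a minimal sketch, assuming a Python back end; the class name, field names and types are illustrative assumptions, not the DFG's or IFQ's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class FinalReport:
    """Illustrative record covering the 'General information' section of a final report."""
    dfg_reference_number: str
    applicants: list[str]
    project_leader: str
    institution: str
    topic: str
    funding_period: tuple[int, int]                       # start year, end year
    publications: list[str] = field(default_factory=list)
    phd_theses: list[str] = field(default_factory=list)
    patents: list[str] = field(default_factory=list)
    cooperation_partners: list[str] = field(default_factory=list)

# Invented placeholder values, only to show how such a record would be filled in.
example = FinalReport(
    dfg_reference_number="AB 123/4-5",
    applicants=["Example Applicant"],
    project_leader="Example Applicant",
    institution="Example University",
    topic="Example topic",
    funding_period=(2003, 2006),
)
```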

  5. References for drafting an expert's report on final reports
• Short expertise
• Comment on the form of the report (outline, layout)
• Comment on the content (level of achievement, duration, methodology)
• Evaluation of the findings (appropriate recognition and promotion, quality of publications, additional requirements)
• Additional comments

  6. Information included in final reports of DFG-funded projects (Heise 2006)

  7. “What relevant information could be asked from the project leaders, and how?” (Färkkilä 2004)

  8. Which of the following purposes of research evaluation apply to your organisation? (N = 15 European funding agencies)

  9. What does success mean from the viewpoint of a funding agency?

  10. [Figure slide, no transcribed text]

  11. [Figure slide, no transcribed text]

  12. Level: Individual Scientist (multi-authorship)
• Fractional counting
• Complete counting
• Paternity test
• Shareholder (author, co-author, co-writer, sub-author, contributor, hyper-author)
• Creatership
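A minimal sketch of the first two attribution schemes listed above, complete counting versus fractional counting; the author names and publication lists are invented purely for illustration.

```python
from collections import defaultdict

# Invented example: each publication is represented by its list of co-authors.
publications = [
    ["Author A", "Author B", "Author C"],
    ["Author A", "Author B"],
    ["Author C"],
]

def complete_counting(pubs):
    """Complete (whole) counting: every co-author gets one full credit per paper."""
    credit = defaultdict(float)
    for authors in pubs:
        for author in authors:
            credit[author] += 1.0
    return dict(credit)

def fractional_counting(pubs):
    """Fractional counting: each paper's single credit is split equally among its co-authors."""
    credit = defaultdict(float)
    for authors in pubs:
        for author in authors:
            credit[author] += 1.0 / len(authors)
    return dict(credit)

print(complete_counting(publications))    # {'Author A': 2.0, 'Author B': 2.0, 'Author C': 2.0}
print(fractional_counting(publications))  # {'Author A': 0.83, 'Author B': 0.83, 'Author C': 1.33} (rounded)
```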

  13. Level: Project

  14. “Furthermore, many sources of research funding expect researchers to acknowledge any support that contributed to the published work. Just as citation indexing proved to be an important tool for evaluating research contributions, we argue that acknowledgements can be considered as a metric parallel to citations in the academic audit process.” (C. Lee Giles and Isaac G. Councill: Who gets acknowledged: Measuring scientific contributions through automatic acknowledgement indexing. Proceedings of the National Academy of Sciences 101(51), pp. 17599-17604, Dec. 21, 2004)
• Of 335,000 unique research documents within the CiteSeer computer science archive, 188,052 were found to contain acknowledgements (roughly 56%).
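A rough sketch of what automatic acknowledgement indexing in the spirit of Giles and Councill might look like; the agency patterns and the section-detection heuristic are assumptions for illustration, not the authors' actual implementation.

```python
import re

# Illustrative, non-exhaustive patterns for funding entities (assumption).
AGENCY_PATTERNS = {
    "DFG": r"\b(DFG|Deutsche Forschungsgemeinschaft|German Research Foundation)\b",
    "EPSRC": r"\bEPSRC\b",
    "NSF": r"\bNational Science Foundation\b",
}

def acknowledged_agencies(fulltext):
    """Return the funding entities mentioned in a paper's acknowledgement passage."""
    # Naive heuristic: use the text after an 'Acknowledg...' heading if present,
    # otherwise fall back to sentences containing 'supported by'.
    match = re.search(r"Acknowledg\w*\b(.*)", fulltext, re.IGNORECASE | re.DOTALL)
    if match:
        passage = match.group(1)
    else:
        passage = " ".join(s for s in fulltext.split(".") if "supported by" in s.lower())
    return {name for name, pattern in AGENCY_PATTERNS.items() if re.search(pattern, passage)}

sample = ("This work was partially supported by grants ESPRIT 6881 of the European "
          "Community and Wi810/2-5 of the Deutsche Forschungsgemeinschaft DFG.")
print(acknowledged_agencies(sample))  # {'DFG'}
```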

  15. “Both the German Science Foundation (Deutsche Forschungsgemeinschaft) and the United Kingdom Engineering and Physical Sciences Research Council (EPSRC) display a steady upward trend in the proportion of acknowledgements received each year during the 1990s, while the Office of Naval Research and IBM slowly become overshadowed by other entities over the decade.” (Giles and Councill 2004, as cited above)

  16. Examples of DFG acknowledgements found in CiteSeer records:
• Multidimensional Access Methods - Gaede, Günther (1997), 253 citations: “partially supported by the German Research Society (DFG/SFB 373) and by the ESPRIT Working Group CONTESSA”. www.wiwi.hu-berlin.de/~gaede/survey.rev.ps.Z
• Efficient PRAM Simulation on a Distributed Memory Machine - Karp, Luby, der Heide (1992), 72 citations: “Science Institute at Berkeley supported in part by DFG-Forschergruppe ‘Effiziente Nutzung massiv…’”. ftp.uni-paderborn.de/doc/techreports/Informatik/tr-ri-93-134.ps.Z
• Decoding Choice Encodings - Nestmann, Pierce (1996), 52 citations: “supported by the DFG, Sonderforschungsbereich 182, project C2, and by …”. www.cs.auc.dk/~uwe/self/doc/concur96.ps.gz
• Statistical Models for Co-occurrence Data - Hofmann, Puzicha (1998), 24 citations: “was supported by the German Research Foundation (DFG) under grant #BU 914/3-1”. publications.ai.mit.edu/ai-publications/1500-1999/AIM-1625.ps
• Multiresolution Analysis of Arbitrary Meshes - Matthias Eck, Tony DeRose, Tom Duchamp, Hugues Hoppe, Michael Lounsbery, Werner Stuetzle (1995), 222 citations: “This work was supported in part by a postdoctoral fellowship for the lead author (Eck) from the German Research Foundation (DFG), Alias Research Inc., Microsoft Corp., and the National Science Foundation under grants …”
• On Evaluating Decision Procedures for Modal Logic - Ullrich Hustadt, Renate A. Schmidt (1997), 58 citations: “We thank Christoph Weidenbach and Andreas Nonnengart for their critical comments. The work of the second author is supported by the TraLos-Project funded by the DFG”. http://www.ag2.mpi-sb.mpg.de/~schmidt/publications/MPI-I-97-2-003.ps.gz
• An asymptotically optimal multiversion B-tree - … Seeger, Peter Widmayer (1996), 57 citations: “We want to thank an anonymous referee for an extraordinary effort and thorough discussion that led to a great improvement in the presentation of the paper. This work was partially supported by grants ESPRIT 6881 of the European Community and Wi810/2--5 of the Deutsche Forschungsgemeinschaft DFG”. http://medoc.springer.de:9999/Journals/vldb/tocs/../papers/6005004/60050264.ps.gz
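Once acknowledgements such as the CiteSeer excerpts above have been indexed, yearly proportions of the kind quoted on the previous slide can be tabulated. A minimal sketch, using invented placeholder records:

```python
from collections import Counter, defaultdict

# Invented placeholder records: (publication year, acknowledged funding entity).
records = [
    (1995, "DFG"), (1995, "ONR"), (1996, "DFG"), (1996, "IBM"),
    (1997, "DFG"), (1997, "EPSRC"), (1998, "DFG"), (1998, "EPSRC"),
]

def yearly_shares(recs):
    """For each year, the share of indexed acknowledgements each entity receives."""
    by_year = defaultdict(Counter)
    for year, agency in recs:
        by_year[year][agency] += 1
    return {
        year: {agency: count / sum(counts.values()) for agency, count in counts.items()}
        for year, counts in sorted(by_year.items())
    }

for year, shares in yearly_shares(records).items():
    print(year, shares)
```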

  17. What does success mean from the viewpoint of a grant recipient? In preparation for our project “Final Reports” we conducted exploratory telephone interviews (average duration 20 minutes) with DFG applicants and asked them how they assess the attribution of project results to those who produced them. Applicants from chemistry, engineering and educational science participated in the study.

  18. Chemistry: The interviewed chemists generally consider it possible to assign research results to individual projects, with the exception that interdisciplinary projects and networks lead to cross-linking, project overlaps and stimulation from other contexts, so that no one-to-one allocation of ideas is possible. They do, however, consider it possible to assign authors to a project context and to published findings. Without exception, the interviewed applicants from chemistry acknowledge the third-party funding organisation in all publications that emerge from the project; in the case of multiple funding, they name every funding agency involved.

  19. Engineering: The interviewed engineers consider it difficult to relate project output to exactly one project. Results emerge from internal co-operations with other professorships and from interactions between scientists that may be relevant for several projects. While publications can be attributed to authors relatively unambiguously, attributing findings to project contexts is more difficult because of parallel projects, and is often only reconstructed later. Engineers co-operate with other scientists and with industry; co-operation is sometimes agreed only for the duration of a certain project, sometimes for longer periods. In the end, however, all interviewed engineers acknowledge the third-party funding organisation in their publications.

  20. Educational Science: The interviewed scientists consider it possible to assign project output and publications to a concrete project context, although publications from a project context may appear only years later (5-10 years). Publications can be related to persons, and results are always clearly recognizable. All interviewees acknowledge the third-party funding organisation; in educational science this is considered very important, since funding by the DFG stands for quality. Co-operation in educational science typically extends over longer periods of time and is usually given concrete form in advance, within the framework of a project.

  21. Increasing pressure
• Peer assessment
• Grant / project evaluation
• Funding agencies call for acknowledgements (as a kind of “property right”)
• Scientists feel obliged to assign outcomes to projects
• Use of acknowledgements as indicators

  22. Lessons learned:
• Success has many fathers, but failure is an orphan.
• The relation between a project and its outcome is a social construction to the same degree as the relationship between author and publication.
• What we measure is the construction made by scientists, not the causal dependency.
• The evaluation window is not identical with the term of a project.
• Project outcome is more than publications (PhDs, networks, transfer, patents, dissemination of information outside the scientific community, ...), some of which are hard to measure.
• Therefore success indicators (like citations etc.) should be combined with peer assessment of final reports.
• Objectives of systematic use of final reports are: information about programme performance; information for applicants; information for peers about former performance of applicants.

  23. Thank you!
