
Review of Quantitative Monitoring System of Chinese Academy of Sciences


Presentation Transcript


  1. Review of Quantitative Monitoring System of Chinese Academy of Sciences ZHENG Haijun, GUAN Zhongcheng, WANG Biaoxiang, WU Jianmei Management Innovation and Evaluation Research Center, CAS Institute of Policy and Management, CAS AEA 2013, Washington DC

  2. About IPM and ERC of CAS • IPM • Founded in June 1985 • Devoted to studies of development strategy, development and reform policy, public administration, S&T management, and state-of-the-art theories and methodologies of related disciplines • Faculty: 138 • ERC of CAS (also affiliated with IPM) • Faculty: 14 • Mission • Management innovation and evaluation research; institute evaluation

  3. Part 1: Quantitative Monitoring is Necessary

  4. Quantitative Monitoring • Peer Review • Quantitative Monitoring (QM): a widely used management tool at international institutions • NIH • MPG • INRIA • SoSP

  5. Part 2: Quantitative Monitoring in CAS

  6. Development of Quantitative Monitoring at CAS

  7. Evolution of the evaluation schemes (chronological): Blue Book Ev. → White Book Ev. → Yellow Book Ev. → Comprehensive Quality Ev. (Innovation Capacity; no more ranking) → Major Outcomes Oriented Ev. (key indicators + Innovation Capacity; no more ranking)

  8. Blue Book Evaluation (1993-2001) • QM = Evaluation • Purely quantitative evaluation • 1993-1999: ranked within one system • 1998-2001: ranked within three category series • Basic Research • Applied Research • Development and High-Tech Research / Industrialization

  9. Blue Book Evaluation (1993-2001)

  10. Blue Book Evaluation (1993-2001)

  11. Blue Book Evaluation (1993-2001) • Evaluation results were reported indicator by indicator • Institutes were ranked by score into four categories (A, B, C, D) within each indicator • Performance and status were reported as categories • Results of one institute from different years are not comparable

  12. White Book Evaluation (1999-2001) • Objective completion status + basic, strategic, and prospective contributions • Objective completion status evaluation: peer review • S&T objective evaluation • contribution of the work • influence on the subject, etc. • Management objective evaluation • Does the fixed staff turnover rate exceed 5%? • Does usage of the Special Funds for KIP exceed 60%? etc.

  13. White Book Evaluation (1999-2001) • Evaluation of basic, strategic and prospective contribution: QM

  14. White Book (1999-2001)

  15. White Book Evaluation (1999-2001) • Qualitative evaluation was introduced into the White Book evaluation • Evaluation = QM + Peer Review • Scores, not classification results • Qualitative evaluation results were converted into quantitative data, and ranking scores were then computed from all quantitative data • The characteristics of institutes' disciplines were not sufficiently reflected, and QM remained the primary method

  16. Yellow Book Evaluation (2002-2004) • Started to focus on quality over quantity • Evaluation system • Major Innovation Contributions Evaluation: peer review • Basic Indicators Evaluation: conducted by CAS headquarters bureaus • Classification-Oriented Evaluation: for all institutes in CAS • basic research • high-technology research and development • resources, environment, and sustainable development • industrialization

  17. Yellow Book (2002-2004)

  18. Yellow Book Evaluation (2002-2004) • Evaluation = QM + Peer Review (similar to the White Book evaluation) • Started to pay more attention to • major innovation contributions within each series • each subject's specialty • Monitoring results are reported as scores within each category series

  19. Comprehensive Quality Evaluation (2004-2011) • Ten key processes are included in the evaluation • self-evaluation by institutes • strategic planning for the next stage • peer review of existing outcomes • comprehensive analysis of previous evaluations • communication review • on-site assessment • president's office conference, etc. • Final conclusions and opinions draw on the results of these diverse components to support strategic decisions • The evaluation is qualitative, and QM plays an important part in decision making

  20. Comprehensive Quality Evaluation (2004-2011) • Since 2004 • quantitative monitoring has gradually returned to its original function • a powerful tool for comparing institutes by providing year-by-year information • but it cannot display institutes' development trends across years in a comprehensive way

  21. Part 3: Innovation Capacity Indicator (ICI) Monitoring

  22. Design of the Innovation Capacity Indicator (ICI) • Law of Comparative Advantage, David Ricardo, 1817 • In economics, comparative advantage refers to the ability of a party to produce a particular good or service at a lower marginal and opportunity cost than another party. Even if one country is more efficient in producing all goods (absolute advantage in all goods) than the other, both countries will still gain by trading with each other, as long as they have different relative efficiencies.

  23. Design of the Innovation Capacity Indicator (ICI) • Indicators • status • outcomes • impact • Weights • weights of indicators are based on simulations over years of historical data • evaluate the importance of parent indicators • set the price according to data from a given year

  24. How to Choose Indicators • CAS has 100+ institutes and covers most disciplines • Indicators should reflect • common characteristics of institutes • their respective expertise and features

  25. How to Choose Indicators • Expert group discussion, with approval from most institutes • 7 indicators to monitor, listed on the next slide

  26. The seven indicators: Excellent Talents • Research Funds • High-Quality Papers • Consultant Reports • Intellectual Property (IP) • S&T Awards • Completed Major Programs

  27. How to Choose Indicators • To some degree, the ICI indicators account for the different characteristics of different institutes • excellent talents and scientific research funds are necessary, common indicators • high-quality papers are the common indicator reflecting the output of fundamental research institutes

  28. How to Choose Indicators • Intellectual property is an important output indicator for high-technology institutes • Completed Major Programs can reflect how well an institute meets the nation's demands • In addition, characteristic indicators are included among the child indicators • National Natural Science Awards • National Technological Invention Awards • National S&T Progress Awards

  29. How to Choose Indicators • The number of outcomes is increasing rapidly

  30. How to Choose Indicators • Higher quality requirements in choosing indicators • for example, papers must be published in journals whose impact factor is among the top 15% within the discipline (JCR & SJCR)
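
A rough sketch of this quality filter, assuming each paper record carries its journal's impact-factor percentile within its discipline (a hypothetical field; in practice this information would come from JCR/SJCR data):

```python
# Sketch of the "high quality paper" filter: keep only papers published in
# journals whose impact factor ranks in the top 15% of their discipline.
# The paper records and the percentile field are hypothetical.

papers = [
    {"title": "A", "journal_if_percentile": 0.05},  # top 5% journal -> kept
    {"title": "B", "journal_if_percentile": 0.40},  # below threshold -> dropped
    {"title": "C", "journal_if_percentile": 0.14},  # top 14% journal -> kept
]

high_quality = [p for p in papers if p["journal_if_percentile"] <= 0.15]
print([p["title"] for p in high_quality])  # ['A', 'C']
```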

  31. Weights of Indicators • Management and strategy experts reached a group decision on the ICI weights, based on the evaluation orientation, management experience, and a large amount of simulation computation

  32. Weights of Indicators • Using the weights of the 7 indicators, the prices of parent indicators were determined based on data from 2003 and 2004 • Weights of child indicators are determined by their importance • For example, one 973 project is worth 5 points, and one NSFC program is worth 2 points (worked sketch below)
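
As a toy illustration of the child-indicator scoring: the 5-point and 2-point values come from the slide, while the project counts are made up.

```python
# Child-indicator point values taken from the slide: one 973 project is
# worth 5 points, one NSFC program 2 points. The project counts are made up.
CHILD_POINTS = {"973_project": 5, "nsfc_program": 2}
counts = {"973_project": 2, "nsfc_program": 3}

score = sum(CHILD_POINTS[k] * n for k, n in counts.items())
print(score)  # 2*5 + 3*2 = 16
```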

  33. Computation • C_i = Σ_j w_j · p_j · x_ij • x_ij is the value of indicator j for institute i • p_j is the price of indicator j • w_j is the weight of indicator j • C_i is the innovation capacity of institute i
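
A minimal sketch of this computation, assuming the capacity score is the weighted, priced sum over the seven parent indicators; the indicator names follow slide 26, while the weights, prices, and institute values below are hypothetical placeholders, not the actual CAS figures:

```python
# Illustrative sketch of the capacity computation C_i = sum_j w_j * p_j * x_ij.
# The seven parent indicators follow slide 26; the weights and prices are
# hypothetical placeholders, not the actual CAS values.

INDICATORS = [
    "excellent_talents", "research_funds", "high_quality_papers",
    "consultant_reports", "intellectual_property", "s_and_t_awards",
    "completed_major_programs",
]

WEIGHTS = {name: 1.0 / len(INDICATORS) for name in INDICATORS}  # w_j (assumed)
PRICES = {name: 1.0 for name in INDICATORS}                     # p_j (assumed)


def capacity(values: dict) -> float:
    """Innovation capacity of one institute: C_i = sum_j w_j * p_j * x_ij."""
    return sum(WEIGHTS[j] * PRICES[j] * values.get(j, 0.0) for j in INDICATORS)


# x_ij: one institute's raw indicator values for a year (made-up numbers).
institute = {"excellent_talents": 12, "research_funds": 30.5,
             "high_quality_papers": 85, "consultant_reports": 4,
             "intellectual_property": 20, "s_and_t_awards": 2,
             "completed_major_programs": 6}
print(f"capacity score: {capacity(institute):.2f}")
```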

  34. How to Consider Efficiency • The totals over the seven indicators are divided by the number of innovation positions or by the regular budget • by regular KIP funding, if the parent indicator represents money • by the number of KIP positions, if it does not (see the sketch below)
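
A sketch of this efficiency adjustment under the stated rule; which indicators count as money-type, and all figures, are assumptions for illustration:

```python
# Hypothetical sketch of the efficiency adjustment: a parent indicator's
# score is divided by regular KIP funding when the indicator represents
# money, and by the number of KIP innovation positions otherwise.

MONEY_INDICATORS = {"research_funds"}  # assumed to be the money-type indicator


def efficiency(scores: dict, kip_funding: float, kip_positions: int) -> dict:
    """Divide each parent-indicator score by the matching input measure."""
    return {
        name: score / (kip_funding if name in MONEY_INDICATORS else kip_positions)
        for name, score in scores.items()
    }


# Made-up scores and inputs for one institute.
print(efficiency({"research_funds": 30.5, "high_quality_papers": 85.0},
                 kip_funding=12.0, kip_positions=150))
```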

  35. ICI for CAS • Provides strong support for the management of CAS • Institutes can also see where they stand among similar institutes • Shows the development trends of CAS and its institutes in recent years

  36. ICI for CAS

  37. Part 4: Conclusion

  38. Conclusion • The QM system fits the drive to improve the research quality of CAS institutes • It rapidly boosted the outcomes of CAS • ICI is a useful management tool

  39. Limitations of ICI • Some indicators disadvantage some institutes • e.g., High-Quality Papers for small disciplines • ICI does not reflect all characteristics of an institute • mainly common indicators • Some meaningful work is difficult to quantify • large-scale scientific facilities • scarce resources

  40. Conclusion • Peer review is getting more attention from CAS • ICI has become a monitoring tool, not one for ranking or comparing institutes • In 2012, under the Major Outcomes-Oriented Evaluation, QM remains an important part of the evaluation system • monitoring key indicators + ICI • focus on the diagnostic function for institutes' own development

  41. Thanks • ZHENG Haijun haijzheng@casipm.ac.cn • GUAN Zhongcheng guan@casipm.ac.cn • WANG Biaoxiang xwbill@casipm.ac.cn • WU Jianmei wujianmei@casipm.ac.cn
