
Monitoring and evaluation of science, technology & innovation: An International Perspective

Presentation Transcript


  1. Monitoring and evaluation of science, technology & innovation: An International Perspective

  2. M&E a core element in strengthening national S&T policies • Accountability: Demand from elected officials and the public for accountability for the use of resources • Policy-Making: Guidance on policy choices, reforms and innovations, and knowledge of bottlenecks in national innovation system • Program Improvement: Information to track implementation of S&T policies and to decide on improvements

  3. Key challenges to developing M&E systems in Latin America • Lack of transparency in results achieved with public resources invested in R&D • Fragmentation in responsibilities for collecting and reporting data • Unreliable statistics, notably in regard to private sector R&D • Low capacity to process and analyze data

  4. Monitoring and evaluation • Monitoring: continuous process of collecting and analyzing information to compare how well a project, program or policy is being implemented against expected plans or results • Evaluation: systematic assessment of planned, on-going or completed interventions to determine their relevance, appropriateness, effectiveness, efficiency, impact or sustainability

  5. Responding to the demand for information • Developing theory of change and sector strategy • Determining what and how to measure • Organizational arrangements for collecting information • Capacity to analyze and use collected information

  6. Developing theory of change and sector strategy – determining what to monitor • Strategy implies the movement of a sector from its present stage to a desirable but uncertain future stage. Because the country/sector has never been in this future position, its intended pathway involves a series of linked hypotheses • An M&E module is a method to document and test the assumptions inherent in the strategy

  7. Measuring the middle ground • Generally weak articulation of the mechanisms through which inputs and activities are expected to result in the desired impact • Milestones and intermediate measures are important in S&T, where final outcomes may take years to emerge [Diagram: results chain from Intervention through Inputs, Activities, Outputs, Outcomes and Results to the Goal (Impacts), with question marks over the middle links]

  8. Causality tree • Make relationships explicit so they can be monitored, validated and managed • Represents the best guess as to an appropriate course of action, given the best available knowledge [Diagram: causality tree for Chile. A coherent strategy for innovation and human capital, grants for international cooperation, acquisition of S&T equipment and cooperative research consortia feed intermediate outcomes (advanced human capital, excellence in science, insertion of researchers into industry, state-of-the-art scientific community, improved private sector research capacity, enhanced public-private linkages and innovation, conducive innovation policies and instruments), which strengthen Chile's national innovation system, improve human capital and support the transition to a knowledge-based economy, leading to sustained growth; continual monitoring and evaluation runs alongside the whole chain.]
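As a minimal illustrative sketch (not part of the original presentation), a causality tree like the one above can be encoded as a nested data structure so every hypothesized link can be enumerated and paired with indicators. The node names below are abbreviated from the Chile example; the structure itself is an assumption for illustration.

```python
# Hypothetical encoding of a causality tree: dicts map an outcome to the
# lower-level outcomes or instruments assumed to produce it.
causality_tree = {
    "Sustained growth": {
        "Transition to a knowledge-based economy": {},
        "Improved human capital": {
            "Advanced human capital": ["Grants for international cooperation"],
        },
        "Strengthened national innovation system": {
            "Excellence in science": ["Acquisition of S&T equipment"],
            "Enhanced public-private linkages": ["Cooperative research consortia"],
        },
    },
}

def walk(node, depth=0):
    """Print every hypothesized link so each can be assigned an indicator."""
    if isinstance(node, dict):
        for outcome, children in node.items():
            print("  " * depth + outcome)
            walk(children, depth + 1)
    else:  # leaf: list of policy instruments
        for instrument in node:
            print("  " * depth + "- " + instrument)

walk(causality_tree)
```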

  9. Indicators and incentives • ‘Indicators’ rather than ‘measures’ • Risk: “What you measure is what you get” • Academic research: risk of cementing an ‘ivory tower’ mentality by assigning too much importance to bibliometrics • Develop indicators for desirable behavior, e.g. collaboration, entrepreneurship, mobility

  10. Determining what and how to measure Types of indicators • Direct measures, e.g. annual PhD graduation and bibliometrics • Proxies, e.g. average country citation impact factor • Perception variables, e.g. transparency of funding allocation mechanisms Apply international standards and data manuals (Oslo and Frascati) to facilitate international benchmarking

  11. Challenge of measuring private sector innovation • Understanding the determinants for turning science into business is essential to ensuring that investments in S&T are translated into increased competitiveness and growth • Company innovation surveys, e.g. Chile (1994-95; 1997-98; 2000-01) • Challenge: To be useful, content needs to be agreed between several sectors and agencies, e.g. Ministry of Education, Ministry of Economy and National Statistical Agency

  12. Marginal effect of company characteristics on patents, product innovation and process innovation Interpretation: Collaboration with universities increases the probability of being involved in patenting by 35 percent, keeping all other factors constant
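For illustration only, the sketch below shows how marginal effects of this kind are commonly estimated: a probit model of a binary innovation outcome on firm characteristics, followed by average marginal effects. The data are simulated and the variable names (size, rd_intensity, collab_univ) are assumptions, not the Chilean survey.

```python
# Illustrative sketch: average marginal effects from a probit model of patenting.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Hypothetical firm characteristics
size = rng.normal(4, 1, n)                 # log number of employees
rd_intensity = rng.normal(0.02, 0.01, n)   # R&D spending / sales
collab_univ = rng.binomial(1, 0.3, n)      # collaborates with a university

# Simulated latent propensity to patent (coefficients are made up)
latent = -3 + 0.4 * size + 20 * rd_intensity + 1.0 * collab_univ + rng.normal(0, 1, n)
patents = (latent > 0).astype(int)

X = sm.add_constant(np.column_stack([size, rd_intensity, collab_univ]))
probit = sm.Probit(patents, X).fit(disp=False)

# Average marginal effects: change in P(patent) per unit change in each regressor
margeff = probit.get_margeff(at="overall", method="dydx")
print(margeff.summary())
```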

  13. Organizational arrangements for collecting information • Assigning responsibilities: Clear agreement on who does what, when and how • Consider the cost of the M&E system, e.g. build on existing reporting structures • Providing an incentive to collect and report valid data by ensuring that requested information is useful for decision-making at both central and decentralized levels • Avoiding heavy bureaucratic burdens on researchers that take time away from core activities

  14. Systems of performance reporting • Use standardized reporting to increase transparency, ensure comparability and reduce administrative load • Essential to ensure collection of longitudinal data rather than a number of cross-sections

  15. National capacity for analyzing data • Unit with skills and capacity to process and analyze data, undertake studies, and suggest policy implications • Chile: National Observatory for S&T • Technical rigor in designing evaluations (process, impact): challenge of using quasi-experimental designs in small innovation systems using competitive resource allocation mechanisms

  16. Impact evaluation: Difference-in-difference design [Diagram: outcomes of a competitively selected treatment group and a control group are measured at T1 and T2; the impact estimate is the difference between the two groups' changes, ΔYT − ΔYC] • Cost-effectiveness: Evaluate against the best alternative investment rather than the no-treatment scenario
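A minimal numerical sketch of the difference-in-difference logic, using made-up firm data; the column names and values are illustrative assumptions, not results from any evaluation.

```python
# Difference-in-difference: compare the T1-to-T2 change for competitively
# selected beneficiaries against the change for non-beneficiaries.
import pandas as pd

df = pd.DataFrame({
    "firm":    [1, 1, 2, 2, 3, 3, 4, 4],
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],   # 1 = received the S&T grant
    "period":  ["T1", "T2"] * 4,
    "outcome": [10.0, 16.0, 12.0, 19.0, 11.0, 13.0, 9.0, 10.5],
})

change = (
    df.pivot_table(index=["firm", "treated"], columns="period", values="outcome")
      .assign(delta=lambda d: d["T2"] - d["T1"])
      .reset_index()
)

delta_treated = change.loc[change.treated == 1, "delta"].mean()   # ΔY_T
delta_control = change.loc[change.treated == 0, "delta"].mean()   # ΔY_C
did = delta_treated - delta_control                               # ΔY_T − ΔY_C
print(f"DiD impact estimate: {did:.2f}")
```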

  17. Impact evaluation: Challenge of selection bias [Diagram: between T1 and T2 the treatment group's ex-ante trend already differs from the control group's; controlling for past trends separates selection bias from the impact estimate ΔYT − ΔYC] • A baseline is a line, not a point
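Following the same illustrative setup, the sketch below checks pre-intervention trends, which is the practical meaning of "a baseline is a line, not a point"; all data are hypothetical.

```python
# Pre-trend check: with several pre-treatment observations, compare the
# pre-intervention trends of the treatment and control groups before
# attributing the post-intervention gap to the program.
import pandas as pd

panel = pd.DataFrame({
    "group":   ["treated"] * 4 + ["control"] * 4,
    "period":  [-2, -1, 1, 2] * 2,             # negative = before the intervention
    "outcome": [8.0, 9.0, 14.0, 16.0,          # treated firms
                7.5, 8.5, 10.0, 11.0],         # control firms
})

pre = panel[panel.period < 0]
pre_trend = pre.groupby("group")["outcome"].apply(lambda s: s.diff().mean())
print("Average pre-intervention change per period:")
print(pre_trend)
# Similar pre-trends support the comparison; diverging pre-trends signal
# selection bias that a single baseline point would hide.
```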

  18. Summing up • M&E essential for developing and maintaining effective ST&I policies • Need to ground M&E in national strategy for ST&I • Indicators shape incentives • Base M&E systems on international data manuals • Place increased emphasis on understanding the determinants for turning science into business • Develop capacity to use M&E for learning and improving ST&I policies

  19. Thank you. Kristian Thorn, kthorn@worldbank.org
