
Moscow April 27, 2016



  1. The use of indicators for evaluation and policy making Giorgio Sirilli, Research Associate Moscow, April 27, 2016

  2. Outline of the presentation: STI indicators; Science and technology policy; Evaluation (the need, the socio-political dimension, cost, the experience of Italy, impact on universities); Concluding remarks

  3. Indicators Statistic: a numerical fact or datum, i.e. one computed from a sample. Statistical data: data from a survey or administrative source used to produce statistics. Statistical indicator: a statistic, or combination of statistics, providing information on some aspect of the state of a system or of its change over time. (For example, gross domestic product (GDP) provides information on the level of value added in the economy, and its change over time is an indicator of the economic state of the nation.)
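
To make the statistic/indicator distinction concrete, the derived change of a statistic can be computed in a few lines; the GDP figures below are invented for illustration, not real national accounts.

```python
# Toy GDP levels in billion USD -- invented figures, for illustration only.
gdp = {2014: 2050.0, 2015: 2091.0, 2016: 2133.8}

def growth_rate(series, year):
    """Year-on-year percentage change of a statistic.
    The level is the statistic; this derived rate is the
    indicator of the system's change over time."""
    prev = series[year - 1]
    return 100.0 * (series[year] - prev) / prev

print(round(growth_rate(gdp, 2015), 2))
```

The same pattern applies to any level statistic (GERD, patent counts, publications): the indicator is the transformation chosen to answer a policy question.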

  4. Indicators Indicators are a technology, a product, which: governs behaviour; is modified by users (outside the producer community); develops in response to user needs. Data sources • Surveys, administrative data, private files, case studies • Data collection is informed by manuals. Data populate statistics, which can serve as indicators. Decisions are taken on the basis of indicators.

  5. Indicators S&T indicators are defined as “a series of data which measures and reflects the science and technology endeavor of a country, demonstrates its strengths and weaknesses and follows its changing character notably with the aim of providing early warning of events and trends which might impair its capability to meet the country’s needs”. Indicators can help “to shape lines of argument and policy reasoning. They can serve as checks, they are only part of what is needed”. (OECD, 1976)

  6. Indicators Research and development (R&D) Innovation statistics Intangible investment Patents Technology balance of payments Trade of high-tech products Human resources Venture capital Bibliometrics Public perception of science and technology …

  7. Keith Pavitt “One would think that the political agenda determines the collection and analysis of indicators. In reality it is the other way round: it is the availability of indicators which steers the political discourse.”

  8. Fred Gault “Policy analysts should be both literate and numerate, able to put a case using innovation indicators. Not only should the analysts have such a skill set, but they also require some knowledge of the subject. It is in this environment that monitoring, benchmarking and evaluation lead to policy learning and to more effective policies.”

  9. The user of indicators: an acrobat

  10. The producer of indicators “The sorcerer's apprentice” (Wolfgang Goethe). Sorcerer (колдун): “An irresponsible person who instigates a process or project which he is unable to control, risking irreversible damage.”

  11. S&T statisticians: arms producers? “I have no regrets. Others are responsible for the bloodshed caused by the AK-47. It is politicians’ fault not to be able to find appropriate solutions, resorting instead to violence.” (M. Kalashnikov)

  12. A brief history of S&T indicators The first attempt to measure S&T in 1957 Frascati Manual (1963) The Frascati Manual “family” A continuous process of broadening and deepening: from macro to micro, from public to private The role of international organisations The dialogue between producers and users

  13. R&D resources Source: OECD Science, Technology and Industry Scoreboard, 2015

  14. The world is changing [Chart: R&D expenditure of the US, EU28, BRIICS, Japan and China] Source: OECD, STI Outlook 2014

  15. A changing global R&D landscape GERD, million USD 2005 PPP, 2000-12 and projections to 2024 Source: OECD estimates based on OECD MSTI database, June 2014.

  16. Public funding to R&D and innovation

  17. Public funding to R&D and innovation

  18. R&D intensity Source: OECD. STI Outlook 2014

  19. The mystique of ranking GERD is used for target setting: from descriptive to prescriptive. “The American GERD/GDP ratio of the early 1960s, that is 3%, as mentioned in the first paragraphs of the first edition of the Frascati Manual, became the ideal to which member countries would aim, and which the OECD would implicitly promote” (Godin) Lisbon: EU 3% (2% business, 1% public sector)
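
The prescriptive use of the ratio reduces to simple arithmetic: given GERD and GDP (both hypothetical figures below), the distance from the 3% Lisbon objective is the extra R&D spending it implies.

```python
# Hypothetical country figures, for illustration only (billion EUR).
gdp = 1800.0   # gross domestic product
gerd = 24.3    # gross domestic expenditure on R&D

intensity = gerd / gdp        # R&D intensity, the GERD/GDP ratio
target = 0.03                 # the Lisbon 3% objective
gap = target * gdp - gerd     # additional spending implied by the target

print(f"intensity = {intensity:.2%}, gap = {gap:.1f} billion EUR")
```

For these invented figures the intensity is 1.35%, so meeting the target would require more than doubling R&D expenditure, which illustrates how a descriptive statistic becomes a prescriptive one.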

  20. The R&D/GDP objective Objectives of R&D expenditure and differences with present levels, 2014 Source: OECD estimates based on OECD MSTI database, June 2014.

  21. A rhetorical device: a plethora of figures and graphs “Secure a quantitative statement of the critical elements in an official’s problem, draw it up in concise form, illuminate the tables with a chart or two, bind the memorandum in an attractive cover tied with a neat bow-knot (…). The data must be simple enough to be sent by telegraph and compiled overnight” (Mitchell, 1919)

  22. Is technological progress slowing down?

  23. R&D in universities and public research agencies

  24. Science and technology policy Science policy is an area of public policy which is concerned with the policies that affect the conduct of the science and research enterprise, including the funding of science, often in pursuance of other national policy goals such as technological innovation to promote commercial product development, weapons development, health care and environmental monitoring. (Wikipedia)

  25. A brief history of science and technology policy Patronage of rulers (e.g. in the Renaissance) The Industrial Revolution Between the First and the Second World Wars: rockets, nuclear energy, operations research, DDT After the Second World War: science and technology policy of governments

  26. Science and technology policy • Vannevar Bush • “Science, the Endless Frontier”, 1945

  27. “Science, the Endless Frontier” Issues to be addressed through science: - defence - health Solution: science policy

  28. NABS objectives Exploration and exploitation of the earth; Environment; Exploration and exploitation of space; Transport, telecommunication and other infrastructures; Energy; Industrial production and technology; Health; Agriculture; Education; Culture, recreation, religion and mass media; Political and social systems, structures and processes; General advancement of knowledge: R&D financed from general university funds (GUF); General advancement of knowledge: R&D financed from sources other than GUF; Defence

  29. Evaluation Evaluation may be defined as an objective process aimed at the critical analysis of the relevance, efficiency, and effectiveness of policies, programmes, projects, institutions, groups and individual researchers in the pursuance of the stated objectives. Evaluation consists of a set of coordinated activities of comparative nature, based on formalised methods and techniques through codified procedures aimed at formulating an assessment of intentional interventions with reference to their implementation and to their effectiveness. Internal/external evaluation

  30. A question (doubt) The scientific community has always been evaluated (mostly internally) When you measure a system, you change the system

  31. In the UK Research Assessment Exercise (RAE) Research Excellence Framework (REF) (impact) “The REF will over time doubtless become more sophisticated and burdensome. In short we are creating a Frankenstein monster” (Ben Martin)

  32. Why do we need evaluation? Need for a coherent strategy with clear priorities Need for new/improved tools to help determine: • - which current activities are worth keeping • - which to cut back to allow new things to emerge • - effective (evidence-based) policies for basic/strategic research

  33. Types of decisions in science policy • Distribution between sciences (e.g. physics, social sciences) • Distribution between specialties • e.g. high-energy physics, optical physics • Distribution between different types of activity • e.g. university research, postgraduates, central labs • Distribution between centres, groups, individuals

  34. Scope and object of evaluation Type of research e.g. • - academic research vs targeted research • - international big-science programmes Level and object of the evaluation • - individual researcher • - research group • - project-centred • - programme • - whole discipline

  35. Criteria for evaluation Vary according to the scope and purpose of evaluation; they range from criteria for identifying quality/impact of research to criteria for identifying value for money Four main aspects often distinguished • - quantity • - quality • - impact • - utility Criteria can be • - internal – likely impact on advance of knowledge • - external – likely impact on other S&T fields, economy and society

  36. Research evaluation Evaluation of what: research, education, the “third mission” of universities and research agencies (consultancy, support to local authorities, etc.) Evaluation by whom: experts, peers Evaluation at what level: organisations (departments, universities, schools), programmes, projects, individuals (professors, researchers, students) Evaluation when: ex-ante, in-itinere, ex-post

  37. Evaluation in Italy: ANVUR • Established in 2011 • A government agency, not an authority • The relationship with MIUR (Ministry of Education, Universities and Research) • ANVUR activities: • 1. Evaluation of the Quality of Research (EQR) • 2. National Scientific Qualification (NSQ) • 3. Accreditation of universities (AVA)

  38. Evaluation of the Quality of Research by ANVUR Model: Research Assessment Exercise (RAE) Objective: Evaluation of Areas, Research structures and Departments (not of researchers) Reference period: 2004-2010 Start: 2011 Finish: 2014 Actors: - ANVUR (National Agency for the Evaluation of Universities and Research Institutes) - GEV (Evaluation Groups) (#14) (450 experts involved plus referees) - Research structures (universities, research agencies) - Departments - Subjects evaluated: researchers (university teachers and PRA researchers)

  39. Evaluation of the Quality of Research by ANVUR Researchers’ products to be evaluated - journal articles - books and book chapters - patents - designs, exhibitions, software, manufactured items, prototypes, etc. University teachers: 3 “products” over the period 2004-2010 Public Research Agencies researchers: 6 “products” over the period 2004-2010 Scores: from 1 (excellent) to -1 (missing)

  40. Evaluation of the Quality of Research by ANVUR Indicators linked to research: quality (0.5), ability to attract resources (0.1), mobility (0.1), internationalisation (0.1), high-level education (0.1), own resources (0.05), improvement (0.05) Attention basically here!
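
The weights above define a linear composite: each indicator score (normalised to a 0-1 scale) is multiplied by its weight and the products are summed. A minimal sketch; the scores below are invented, not actual VQR results.

```python
# VQR research-indicator weights as listed in the slide.
weights = {
    "quality": 0.5, "attract resources": 0.1, "mobility": 0.1,
    "internationalisation": 0.1, "high-level education": 0.1,
    "own resources": 0.05, "improvement": 0.05,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights sum to one

# Invented example scores for one research structure, on a 0-1 scale.
scores = {
    "quality": 0.8, "attract resources": 0.6, "mobility": 0.5,
    "internationalisation": 0.4, "high-level education": 0.7,
    "own resources": 0.9, "improvement": 0.5,
}

composite = sum(weights[k] * scores[k] for k in weights)
print(round(composite, 3))
```

Because “quality” carries half the total weight it dominates the composite, which is the point of the slide's “attention basically here!”.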

  41. Evaluation of the Quality of Research by ANVUR Indicators of the “third mission”: fundraising (0.2), patents (0.1), spin-offs (0.1), incubators (0.1), consortia (0.1), archaeological sites (0.1), museums (0.1), other activities (0.2) A methodological failure

  42. From science to innovation Average adoption lags have declined markedly over the past 200 years Source: WIPO, World Intellectual Property Report, 2015

  43. Impact of evaluation in Italy Percentage of the General University Fund linked to increases in quality, efficiency and effectiveness, based on the Evaluation of the Quality of Research (VQR) (for three fifths) and on recruitment policy (one fifth): 2014: 16.0% 2015: 18.0% 2017: 20.0% later: up to 30%

  44. The impact of VQR on professors/researchers [Cartoon: “Before you go… would you mind filling this out for me?” (a form labelled VQR)]

  45. Lessons from research evaluation in Italy • Evaluation should enhance efficiency and effectiveness • Pro-active evaluation vs punitive evaluation • Evaluation is a difficult and expensive process • When a system is measured it is changed (opportunistic behaviour) • Peer review vs bibliometrics • NSE vs SSH • Competition vs cooperation of scientists • The myth of excellence • The split of the academic community (the good and the bad guys) • The equilibrium amongst teaching, research and the third mission • Bureaucratisation • Used by the ministry to assign resources • Evaluation in times of crisis

  46. Evaluation is an expensive exercise Research Assessment Exercise (RAE): 540 million Euro Research Excellence Framework (REF): 1 million Pounds (500 million) Evaluation of the Quality of Research (VQR): 300 million Euro (180,000 “products”), 182 million Euro Rule of thumb: less than 1% of the R&D budget devoted to its evaluation
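
The rule of thumb can be checked with back-of-the-envelope arithmetic; the VQR cost is taken from the figures above, while the national R&D budget is an assumed round number, not an official figure.

```python
# VQR cost as cited above; the R&D budget is an assumption for illustration.
vqr_cost = 182e6      # euros
rd_budget = 20e9      # euros, assumed national R&D budget over the period

share = vqr_cost / rd_budget
print(f"evaluation cost = {share:.2%} of the R&D budget")
assert share < 0.01   # within the less-than-1% rule of thumb
```

Under these assumptions the evaluation absorbs 0.91% of the budget, just inside the rule of thumb; a smaller assumed budget would push it over.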

  47. The new catchwords New public management Value for money Accountability Relevance Excellence

  48. Evaluation Effectiveness. Whether the objectives were achieved, and whether their achievement was sufficient to change the original problem situation Value for money. The extent to which benefits exceed costs Efficiency. The cost at which objectives were achieved Appropriateness. Whether a policy or programme was suitable for the problem situation

  49. Encyclicalletter. Pope Francis Economic powers continue to justify the current global system where priority tends to be given to speculation and the pursuit of financial gain with its search for immediate interest, which fail to take the context into account, let alone the effects on human dignity and the natural environment. The technocratic paradigm tends to dominate economic and political life. The economy accepts every advance in technology with a view to profit, without concern for its potentially negative impact on human beings. Finance overwhelms the real economy. Our politics are subject to technology and finance.
