
Analysis of indicators used for CRC Monitoring and Evaluation Ljubljana, 15 September 2009

Toolbox CRC programme managers – Dag Kavlie, RCN.


Presentation Transcript


  1. Analysis of indicators used for CRC Monitoring and Evaluation, Ljubljana, 15 September 2009. Toolbox CRC programme managers – Dag Kavlie, RCN

  2. Analysis of indicators used for CRC Monitoring and Evaluation, Ljubljana, 15 September 2009. Toolbox CRC programme managers – Dag Kavlie, RCN

  3. CRCs have some important aspects in common. Definition by COMPERA: structured, long-term RTDI collaborations in strategically important areas between academia, industry and the public sector. Aim: bridge the gap between scientific and economic innovation by providing a collective environment for academics, industry and other innovation actors, and by creating sufficient critical mass. Multiple activities: pooling of knowledge, creation of new knowledge by performing different types of research, training and dissemination of knowledge, networking, …

  4. CRC Programmes are different in some aspects. Primary characteristic: • Focus on research as a knowledge basis for innovation vs innovation itself. Other important features: • Physical centre vs network (virtual centre) • Separate legal entity vs centre within host institution • Regional (national) focus vs international • Active participation in centre activities by enterprises vs cash contribution • Duration of funding; life after end of funding of centre • Size of budget and funding profile • Open call vs predefined thematic area • Industry-led vs academic-led

  5. Positioning of CRCs is crucial to understand the type of impact and time perspective we should expect. Two axes: is scientific progress a goal (yes/no), and is economic or social use a goal (yes/no)? • Scientific progress yes, economic/social use no: curiosity-driven basic research (Bohr) – research in focus • Scientific progress yes, economic/social use yes: needs-driven basic research (Pasteur) – areas of new technology • Scientific progress no, economic/social use yes: industrial research (Edison) – innovation in focus, increasingly "end market" driven

  6. Physical Centres vs Networks • For a physical centre, the researchers are located in the same premises • Virtual centres may also have co-located research groups, but at several different locations • For networks, the researchers are generally located in the institutions that are the partners of the network

  7. Some main features of present CRC Programmes

  8. What is the purpose of indicators? (NSF) • Collection of some standard measures of performance across all CRCs in a programme (top-down indicators) • Information base for programme management • Information base to help centre evaluators (may also include bottom-up indicators for each centre) • Information base for agency reports to funding ministries on programme achievements

  9. CRC Australia: Measurements along the input-to-impact chain. INPUT • People, money, infrastructure, prior IP. ACTIVITY • Research projects, stakeholder engagement, training. OUTPUT (Result) • Publications, prototypes, patents, PhDs, Masters. IMPACT (Outcome) • Gains in productivity, industrial development, health and environmental benefits

  10. Indicators should have a close relation to the goals for a centre. Examples of expected impacts (outcomes): • Research at the forefront within the thematic area • Knowledge basis relevant for industrial partners • Training of researchers in areas important for industry • Internationalisation • Increased R&D spending of business partners • Innovations by partners • Impact on industry and society at large

  11. Example: Output/Outcome indicators for Canadian NCEs. Performance area 1: Researcher training and recruitment • Number of postdocs • Number of PhD students • Number of Master students • Number of candidates from the centre that are employed in the industrial sector. Performance area 2: Transfer and exploitation of results by industry • Number of patent applications and patents issued • Number of license agreements and income from licenses • Number of new products, services and processes • Case studies demonstrating impact. Performance area 3: Increased productivity and economic growth • Number of jobs created • Examples of companies created in new industrial sectors • Case studies of impact on existing industries • Benefit/cost analysis
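The benefit/cost analysis listed under performance area 3 is typically done on discounted cash flows. A minimal sketch follows; all figures, the discount rate and the `npv` helper are illustrative assumptions, not actual NCE data:

```python
# Hypothetical benefit/cost calculation for a centre, using
# net present value (NPV) of yearly benefit and cost streams.
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

benefits = [0, 10, 30, 50, 60]   # estimated yearly benefits (assumed, MEUR)
costs = [40, 20, 10, 5, 5]       # centre funding and running costs (assumed, MEUR)
rate = 0.05                      # discount rate (assumed)

bc_ratio = npv(benefits, rate) / npv(costs, rate)
print(f"Benefit/cost ratio: {bc_ratio:.2f}")
```

A ratio above 1 indicates that discounted benefits exceed discounted costs; note that benefits typically arrive later than costs, which is why discounting matters for centres with long time horizons.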

  12. Example of performance areas covered in the midway evaluation for a Norwegian CRC: • Management and organisation • Innovation and value creation • Research • Recruitment • International cooperation. Relevant indicators must be identified for each of these areas.

  13. Some important observations • It is useful to have some top-down indicators that are common to all centres and connected to the strategic aims of the programme • Allow centres to formulate their own bottom-up indicators which are considered particularly relevant for each centre • The time domain must be taken into account • Indicators are only one element to be considered in an evaluation of a centre or programme • Case studies of success connected to performance areas are a valuable part of an evaluation • A common European "best practice" should not be the aim, even if competence centres have much in common • Much can be learned by studying the indicators used for different programmes together with the strategic aims of the programme

  14. Analysis of indicators used by COMPERA members Feedback from eight COMPERA members Analysis along three main dimensions • Research • Innovation • Centre dimension

  15. Research dimension: Result (Output) indicators commonly used (number of programmes using each indicator in parentheses) • No of approved EU projects within the centre's field of operation (6) • No of published papers in refereed journals (5) • No of international conference contributions (5) • No of projects with international partners (4) • No of co-publications with industrial partners (3) • No of EU projects with role as coordinator (3) • No of MSc degrees connected to the centre (3) • No of PhD students working in the centre (3) • No of international visiting researchers (2)

  16. Innovation dimension: Result (Output) indicators commonly used • No of patent applications (5) • No of new enterprise partners (5) • No of project results that are protected by other means than patents (trademarks etc.) (4) • No of projects with active involvement of enterprise partners (4)

  17. Centre dimension: Result (Output) indicators commonly used • Active involvement of enterprises in research agenda setting (6) • No of centre events like workshops, seminars etc. (5) • Volume of additional funding (4) • Communication – press cuttings related to the centre (3) • Mobility of staff between partners (2)

  18. Impact (Outcome) indicators commonly used. Research • No of PhD theses completed (3) • Increase in R&D spending by enterprise partners (4). Innovation • No of patents (5) • No of licences based on patents (4) • No of new products, processes and services (5) • No of spin-off companies (4) • Recruitment of personnel from academia to industry (4)
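The frequency counts on the last slides (how many of the eight COMPERA programmes use each indicator) can be reproduced with a simple tally over per-programme indicator lists. A minimal sketch, where the programme names and indicator lists are invented for illustration, not the actual COMPERA feedback:

```python
# Hypothetical tally of indicator usage across programmes, producing
# "indicator (n)" lines like those on the preceding slides.
from collections import Counter

# Each programme reports the output indicators it collects (assumed data).
programme_indicators = {
    "Programme A": ["patent applications", "refereed papers", "EU projects"],
    "Programme B": ["patent applications", "EU projects"],
    "Programme C": ["refereed papers", "EU projects", "new enterprise partners"],
}

# Count, for each indicator, the number of programmes using it.
usage = Counter(
    indicator
    for indicators in programme_indicators.values()
    for indicator in indicators
)

for indicator, count in usage.most_common():
    print(f"{indicator} ({count})")
```

Sorting by frequency makes the cross-programme comparison immediate: indicators used by most programmes are natural candidates for common top-down indicators, while rarely used ones are centre-specific.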
