
Enterprise Ireland experience on Monitoring and Evaluation of Industry Led Competence Centres


Presentation Transcript


  1. Enterprise Ireland experience on Monitoring and Evaluation of Industry Led Competence Centres. Compera Conference, Ljubljana, Sept 15, 2009. Martin Hussey, Enterprise Ireland.

  2. Presentation Overview • Brief Introduction to Irish Competence Centres • Model for Industry Led Competence Centres • Previous Model (PAT) experiences • CREST ILCC Best Practice • Plan for Future • Discussion

  3. Enterprise Ireland. EI’s mission is to accelerate the development of world-class Irish companies to achieve strong positions in global markets, resulting in increased national and regional prosperity. Our clients are Irish companies engaged in manufacturing or internationally traded services. Focus on export sales, research and innovation, productivity, starting up and scaling up, and regional enterprise. EI is responsible for the Competence Centres Programme, but other RTDI agencies are also involved in collaborative research funding.

  4. EI and the RTDI Landscape in Ireland. Other RTDI organisations: 1) Industrial Development Agency (IDA) Ireland, concerned with FDI, plays a significant role in funding collaborative R&D for its clients. 2) Science Foundation Ireland (SFI), concerned with scientific research in ICT and Biotechnology and, more recently, Energy. 3) Higher Education Authority (HEA), concerned with building the research infrastructure in the Third Level sector. Other bodies exist for research in health, agriculture, marine affairs, etc.

  5. Collaborative Models – Ireland • HEA Programme for Research in Third Level Institutions (PRTLI): large capital and research grants to build capacity. • SFI Centres for Science Engineering and Technology (CSETs) are academic led and owned, with 15-20% matching industry funds. • EI Industry Led Networks are 100% state-funded research projects to an industry-set agenda.

  6. EI Competence Centres – policy and context. Objective: to increase business activity and expenditure in RTDI in order to generate economic benefit and sustainable growth. Intervention: required to address a market failure in high-risk R&D; required to encourage collaboration between business and the research provision community; required to translate the training needs of industry into the research community. Ireland: the Government’s Strategy for Science, Technology and Innovation calls for the establishment of Competence Centres in Ireland.

  7. Activity – useful outputs of strategic research. [Quadrant diagram, source: Vinnova Analysis 2004. Axes: quest for fundamental understanding (yes/no) and considerations of use / market suitability (yes/no). Quadrants: pure basic research (Bohr), use-inspired basic research (Pasteur), pure applied research (Edison), and a quadrant with neither (“not a place you’d want to be…”). The focus of competence centres is the use-inspired basic research (Pasteur) quadrant; the diagram also flags the risk of market failure.]

  8. Industry-led Competence Centres Positioning – Ireland. [Positioning diagram placing instruments along axes of scale/duration/impact and academic versus industry leadership: PRTLI Institute (e.g. Conway), SFI CSET, HEA network/cluster, Industry-led Network and the Competence Centre.]

  9. Model – Industry Led Competence Centre • Centre hosted at one site to give clear responsibility/critical mass • Focussed on collaborative research, following an agenda set, maintained and revised by industry • Conducted under the direct supervision of an Industry Board with the power to change the direction of the research in line with market requirements • Clear metrics, including scale of collaborative research, technology transfers such as licences, spin-outs, staff transfers (both to and from industry), and new companies brought into the consortium • Led by a Technology Leader with strong industry background

  10. Model – Technology Leader A Technology Leader will be an Industrially experienced researcher, employed by a Host PRO, charged with delivering the Research needs of the Industry group associated with the Competence Centre. The Technology Leader award will be up to €1M annually, inclusive of overhead, for five years. There will be a significant breakpoint review in Year 3, which can recommend continuation, closure, reductions or extensions. Funding will be contingent on satisfactory performance, including review by the Industry Steering Board. The award is to cover the costs of the Technology Leader, a dedicated research team and associated costs.

  11. Technology Leader – Host – Competence Centre. Hosting a Technology Leader will require evidence of the following: Track record in and clear access to relevant research competence in the Technology Area. Quick and flexible access to facilities, administrative supports and space. A clear methodology for working effectively with other relevant Institutions. Empowerment of an Industry Steering Board in setting, reviewing and amending the Research Agenda and in prioritisation and allocation of resources to projects addressing the Research Agenda. IP and Technology Transfer methods and policies that are aligned with the Competence Centres programme goals, as evidenced by commercialisation of research. Alignment with Institutional Strategy.

  12. Metrics and Indicators. As best practice, a mix of top-down and bottom-up metrics and indicators will be agreed and form part of the contract (revised annually). Metrics: number and scale of industrial collaborative research projects; licences, spin-outs and spin-ins; staff transfers (both to and from industry); additional R&D activity within companies (BERD); new companies brought into the consortium. Indicators: Research Quality, International Effect, Management Quality, Training Outputs, Soft Technology Transfers.

  13. Previous Model - PATs • Multiple University Host locations – virtual networked Centres/units • Cost recovery targets year on year, with self-sustaining status as a target • Multiple part-time Research Directors, some permanent technical staff • Funding front loaded – falls away after time • Industry Advisory Board • Metrics were financial, with top down requirements added later • Used liaison with Academia to develop Technology Focus.

  14. PATs Experiences • Multiple University Host locations – attribution of results to specific Centres became difficult; competition between locations for business! • Cost recovery – drove Centres away from research towards training, consultancy and technical services. • Part-time Directors did not necessarily engage with or prioritize the Centre’s activities over host activities; permanent staff made Centre closure problematic. • Funding front-loaded – concentrated expenditure on capital at the start and under-funded potential growth towards critical mass over time.

  15. PATs Experiences • Industry Advisory Board – was advisory only; this did not empower or engage industry, and they drifted away. Centres revert to Host academic-driven behaviour. • Metrics were financial – Centres focused on meeting financial targets only and sought big and easy projects rather than meeting an overall industrial impact goal. This led to a concentration on Framework projects and projects with MNCs. • Academic-led Technology Focus did not always match the Industry landscape – in some cases the work was of more benefit to non-EEA companies. • Management metrics were not used until much later, and top-down metrics were applied later still. This creates difficulties – metrics should be shaped to mirror the growth and maturity of the Centres.

  16. CREST ILCC - Best Practice - Governance Governance model must reflect policy context and allow the Centre to develop its performance in line with policy goals. Leadership is key – chair, director and board must be capable of delivering true collaboration and effective management. Centre should develop and evolve, renewing the Research Agenda and the operational model over time. Qualified sub-committees are advised for IP management and Scientific/research excellence.

  17. CREST ILCC - Best Practice – Metrics and Indicators. A mix of ‘top-down’ metrics and indicators reflecting policy goals and ‘bottom-up’ metrics and indicators reflecting a Centre’s own view of its success is recommended. Metrics and indicators should evolve over time as the Centre develops and grows – expectations of this need to be managed from the outset. Best practice selection of metrics and indicators should demonstrate the commitment of the stakeholders to collaboration. Metrics and indicators should be used in the management of Competence Centres Programmes – support success, correct failure.

  18. CREST ILCC - Best Practice – Financing and Sustainability. The report recommended that prescribing ‘self-financing’ status as an aim for Competence Centres was not advisable; this is to maintain the correct type of research activity, addressing a market failure. Best practice indicates that Centre Programmes should be long term but finite; the founding logic for a Centre will have shifted over time. In order to prevent ‘mission drift’, mechanisms should be set whereby funds are concentrated towards multi-lateral collaborative research projects over the period of operation of a Competence Centre. IP practice and EU State Aid rules must be given early consideration in Centre establishment.

  19. CREST ILCC - Best Practice – Training and Mobility. The engagement of industry-based researchers in Competence Centre activities was seen as key. Careful consideration must be given to the employment of student researchers, as the Centre’s research activities and operations may not align with the student’s core curriculum. Centres are excellent training grounds for researchers and a fertile source of new research talent for companies, and should be used as such. As best practice, research management issues should be addressed and Centre researchers developed as research managers where appropriate.

  20. Monitoring and Evaluation Plans. The level of investment in Competence Centres Programmes means that overall evaluations of Programmes generally consist of large-scale independent reviews involving in-depth interviews/research. Evaluations are therefore planned for Year 3 of operation and on Programme closure. The mid-point evaluation is designed to offer three answers – wind-down, continue or re-scale. Focus is on the quality of management and activities, strong evidence of impacts and transfers to member companies, and plans for the future. As the Programme aim is BERD, the final evaluation will look back at this impact on member businesses and the wider sector – independent figures are available for this annually (so we have a baseline).

  21. Monitoring and Evaluation Plans - Tools. EI has full-time personnel based in the Host universities already, monitoring a range of projects and activities; each Centre will be assigned one of these personnel as its ‘monitor’. Funding contracts are contingent on the delivery of satisfactory annual technical reports and financial claims. EI has observer status on the Industry Steering Board of the Centre – this means we are aware of all Board meetings and receive the minutes. Boards are to meet 6-10 times per annum. EI directly funds some of the Technology Transfer Offices – their metrics are around commercialisation and will include Competence Centre metrics. Funding contracts oblige Hosts to seek consent to exploit IP.

  22. Monitoring and Evaluation Plans. Number and scale of collaborative research projects will be monitored via the Board, with updates provided per meeting; an annual report by the Technology Leader is also required. Verification of the project activity can be assessed by the Agencies using the company single point of contact. MIS systems have been enhanced to merge the investment-in-R&D data from the Higher Education side with the direct company grant side. Evaluations will take sample case studies to verify the validity of overall data reported, with an audit function on a 5-10% sample.

  23. Monitoring and Evaluation Plans. Licences will be monitored via the Board, with updates provided per meeting; an annual report by the Technology Leader is also required. Verification of the licence activity can be assessed by the Agencies using the Exploitation Consent Clause; the committee meets monthly. Hosts must offer IP first to contributors, then to members and then wider – at market rates; independent assessment of rates is available. Evaluations will take sample case studies to verify the validity of overall data reported, with an audit function on a 5-10% sample.

  24. Monitoring and Evaluation Plans. Start-ups and spin-ins will be monitored via the Board, with updates provided per meeting; an annual report by the Technology Leader is also required. Verification of the start-up activity can be assessed by the Agencies using the Technology Transfer Offices; the committee meets monthly. Attribution of the ‘cause’ of the start-up is to be assessed as soon as practical; spin-in is not an issue. Evaluations will take sample case studies to verify the validity of overall data reported, with an audit function on a 5-10% sample.

  25. Monitoring and Evaluation Plans. Staff transfers will be monitored via the Board, with updates provided per meeting; an annual report by the Technology Leader is also required. Formal transfers can be verified via Host personnel offices, and transfers in are registered where they appear on the payroll. Less formally, timesheeting is a requirement, and as the Centres are physical, a register of attendance can be employed. Evaluations will take sample case studies to verify the validity of overall data reported, with an audit function on a 5-10% sample.

  26. Monitoring and Evaluation Plans. BERD will be monitored annually via an independent Agency. New members: company membership fees (at a nominal rate) are a requirement of the programme, and the Board is to maintain a register of members.

  27. Indicators. Research Quality – the Board and the Hosts are to agree a framework of research quality metrics for published work. International Effect – to be demonstrated by participation in the EU Framework Programme; this is managed via EI in any case. Management Quality – assessed at the Breakpoint Evaluation. Training Outputs – measure of effort, awards etc., captured in Evaluations. Soft Technology Transfer – not yet decided; a ‘certificate’ scheme is in mind.

  28. Industry-led Competence Centres THANK YOU FOR LISTENING DISCUSSION
