Presentation Transcript


  1. Nadia Nardi – Deputy Director Impact of e-Infrastructures: Theories and Practices of the ERINA+ Assessment Methodology EGI Technical Forum 2012

  2. AGENDA • ERINA+ Methodology (F. Bellini) • ERINA+ Data Gathering Tools (Platform) (F. Bellini, A. Passani, J. Benedikt): (1) Project Self-Assessment Webtool, (2) Project Users' Questionnaire, (3) Key Stakeholders Questionnaire • Preliminary Results (A. Passani) • Q&A / Open Discussion (A. Passani)

  3. The ERINA+ Methodology Francesco Bellini

  4. Methodological Workflow

  5. Literature Review (1) • About the definition of e-Infrastructures • Many “holistic” definitions • What are the boundaries? • What is the structure? • e-Infrastructures are certainly a “value network” (Allee 2008) with Roles, Transactions and Deliverables

  6. Literature Review (2) • Socio-economic assessment methodologies • Ex-ante Evaluation Methodologies • Monitoring and Ex-post Evaluation methodologies • Statistical data analysis • Modelling methodologies • Qualitative and semi-quantitative methodologies

  7. Literature Review (3) • Socio-economic assessment of Research Infrastructures should consider: • R&D efficiency and the S/T knowledge base • Economic performance and productivity growth • Quality of human resources • Social cohesion • Scientific knowledge and technological diffusion • Employment • Transaction costs • Quality of life

  8. Impact Assessment Value Chain

  9. …From Literature to ERINA+ • e-Infrastructures Impact Value Chain

  10. Goals of ERINA+ Impact Assessment • Efficiency • Efficacy • Competitiveness and Excellence of Research • Innovation and Transfer outside the domain • Cohesion

  11. ERINA+ Workflow – Value Chain

  12. Mapping (Block 1)

  13. Stakeholder Perception Analysis (Block 2)

  14. Project’s Performance (Block 3) • Four-step process • Research typology identification • Definition of benefit indicators • Benefits measurement • Project final assessment

  15. Step 1 - Research typology identification Aim: to identify which layer of the EC’s e-Infrastructures programme the project belongs to • Networking layer • Computing layer: distributed computing and PRACE • Data layer • Simulation Software and Services

  16. Step 2 – Definition of benefit indicators Aim: to identify a list of benefit indicators that makes it possible to measure both the project’s effectiveness in fulfilling its targets and the economic efficiency of the project’s outputs • A common standard basic framework for measuring: • Efficiency: offered, perceived • Effectiveness: competitiveness & excellence of research; innovativeness of research & transfer outside the domain; cohesion (an illustrative encoding follows below)
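
As a reading aid, the Step 2 framework could be encoded as a simple nested structure for data collection. This is an illustrative sketch in Python: the grouping and the example sub-indicators (drawn from the results slides later in this deck) are assumptions, not the actual ERINA+ schema.

```python
# Illustrative encoding of the Step 2 benefit-indicator framework.
# The structure and the example sub-indicators are assumptions for clarity;
# ERINA+ defines the authoritative list in its self-assessment webtool.
BENEFIT_INDICATORS = {
    "efficiency": {
        "offered": ["capacity/traffic provided by the project output"],
        "perceived": ["cost saving", "time saving", "willingness to pay"],
    },
    "effectiveness": {
        "competitiveness_and_excellence_of_research": ["publications", "IPR"],
        "innovativeness_and_transfer_outside_the_domain": ["dissemination", "spin-offs"],
        "cohesion": ["collaborations", "reuse of existing e-Infrastructures"],
    },
}
```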

  17. Step 3 - Benefits measurement
      Output classes, indicators of "offered efficiency", proxies and sources of information:
      • Networking: traffic served by the project output/year and traffic served without the project output/year; proxy: w.t.p. (willingness to pay) for a unitary increment in bandwidth; source: market of networking services
      • Computing: CPU capacity provided by the project output/year and CPU capacity available without the project output/year; proxy: w.t.p. for a unitary increment in CPU capacity; source: market of computing services
      • Storage: storage capacity provided by the project output/month and storage capacity without the project output/month; proxy: w.t.p. for a unitary increment of storage capacity; source: market of storage services
      • Simulation software and services: n° of users served/year; source: ERINA+ webtool
      • Services for coordination, data management, middleware and support
      Proxies for perceived efficiency (an illustrative monetisation sketch follows): Cost Saving (€/year per user), Time Saving to Access (hours/year per user), Time Saving for Doing (hours/year per user), Willingness to Pay (€/year per user); source: ERINA+ webtool
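
One possible way to read the perceived-efficiency proxies above is to monetise them per user and per year by summing cost savings, time savings valued at an hourly rate, and willingness to pay. The sketch below illustrates that reading only; the aggregation rule and the default hourly rate are assumptions, not the formula ERINA+ actually applies.

```python
def perceived_efficiency_per_user(cost_saving_eur: float,
                                  time_saving_access_h: float,
                                  time_saving_doing_h: float,
                                  willingness_to_pay_eur: float,
                                  hourly_rate_eur: float = 30.0) -> float:
    """Illustrative monetisation of the perceived-efficiency proxies (EUR/year per user).

    Assumption: time savings are valued at a flat hourly rate and simply added
    to cost savings and willingness to pay; ERINA+ may weight these differently.
    """
    time_value_eur = (time_saving_access_h + time_saving_doing_h) * hourly_rate_eur
    return cost_saving_eur + time_value_eur + willingness_to_pay_eur

# Example with made-up figures: 500 EUR saved, 20 h + 40 h saved, 200 EUR w.t.p.
print(perceived_efficiency_per_user(500, 20, 40, 200))  # 2500.0
```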

  18. Step 4 - Project final assessment (1)

  19. Step 4 - Project final assessment (2)

  20. Step 4 - Project final assessment (3)

  21. Social Network Analysis (Block 4)

  22. ERINA+ Database – Aggregate Impact Analysis. The projects’ aggregated analysis covers: net present value (a standard NPV sketch follows below), info about users, publications, intellectual property rights (IPR), employment and gender mainstreaming, spin-offs and start-ups, the impact on ERA, stakeholder perception analysis, and collaborations and social capital.
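
The aggregate analysis lists a net present value among its outputs. A standard NPV over a stream of yearly net benefits looks like the sketch below; the discount rate and the cash-flow figures are placeholders, not ERINA+ results.

```python
def net_present_value(net_benefits: list[float], discount_rate: float) -> float:
    """Standard NPV: yearly net benefits discounted back to year 0."""
    return sum(b / (1 + discount_rate) ** t for t, b in enumerate(net_benefits))

# Placeholder figures: an up-front cost in year 0 followed by three years of net benefits.
print(round(net_present_value([-1000.0, 400.0, 400.0, 400.0], 0.05), 2))  # ~89.3
```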

  23. Blocks’ contribution to the assessment: the aggregated analysis draws on Blocks 1, 2, 3 and 4.

  24. The ERINA+ Platform (architecture overview). Project managers use the Project Self-assessment Tool; stakeholders (including users) provide data on perceived efficiency and effectiveness; offered efficiency and contribution to effectiveness are captured alongside; everything feeds the ERINA+ DB and the e-Infrastructures assessment dashboard serving program managers. Assessment dimensions: • Efficiency: offered vs. perceived • Effectiveness: competitiveness & excellence of research; innovativeness of research & transfer outside the domain; cohesion

  25. Why the Platform? • To create a dashboard for • project managers (self assessment) • the program managers (decision support) • the stakeholders (informative) • A repository for enabling qualitative and policy analysis • To make ERINA+ sustainable after its end

  26. ERINA+ Data Gathering Tools Francesco Bellini, Antonella Passani, Josef Benedikt

  27. Motivation • A single toolbox doesn’t exist! • A mix of RTD evaluation tools is developed, allowing for dynamic data collection and aggregate analysis using different media channels • Aggregate analysis is combined with a weighting system on impact indicators (sketched below) • Results are made accessible to decision/policy makers • The ERINA+ platform offers, online: • data collection tools for various stakeholder groups via the ERINA+ WebTool • a Self-Assessment Tool with dynamic weighting • a platform for decision makers on aggregated results
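
A minimal sketch of the dynamic weighting mentioned above, assuming indicator scores are already normalised to a 0–1 scale; the indicator names and weights are made up for illustration, and the actual ERINA+ weighting is configured in the platform rather than fixed in code.

```python
def weighted_impact_score(scores: dict[str, float],
                          weights: dict[str, float]) -> float:
    """Weighted average of normalised (0-1) indicator scores.

    Assumption: a simple weighted mean; ERINA+ lets assessors adjust the
    weights dynamically in the platform.
    """
    total = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total

scores = {"efficiency": 0.7, "competitiveness": 0.6, "innovation": 0.5, "cohesion": 0.8}
weights = {"efficiency": 0.4, "competitiveness": 0.2, "innovation": 0.2, "cohesion": 0.2}
print(round(weighted_impact_score(scores, weights), 2))  # 0.66
```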

  28. ERINA+ Data Gathering Tools # 1 - Project Self-Assessment Webtool # 2 - Project Users’ Questionnaire # 3 - Key Stakeholders Questionnaire

  29. projects.erinaplus.eu (online) • To obtain quantitative and qualitative data and information from projects • To collect data on ERINA+ indicators of socio-economic development • To provide a self-assessment page for projects • To develop a dynamic online tool on the main impact indicators of efficiency and effectiveness • To support the ERINA+ methodology with advanced online, dynamic self-assessment • The information base grows with the amount of data collected

  30. ERINA+ Data Gathering Tools # 1 - Project Self-Assessment Webtool # 2 - Project Users’ Questionnaire # 3 - Key Stakeholders Questionnaire

  31. users.erinaplus.eu (online) • To obtain qualitative feedback from the broader e-Infrastructures communities • To collect data on various issues representing transformations of the scientific workplace (skills, training, jobs…) • To support the ERINA+ methodology with semi-quantitative questions

  32. ERINA+ Data Gathering Tools # 1 - Project Self-Assessment Webtool # 2 - Project Users’ Questionnaire # 3 - Key Stakeholders Questionnaire

  33. Stakeholder Perception Analysis Serves as a link between quantitative assessments at project level and qualitative assessments of key stakeholders. Adds value to the project assessment by adding a perception perspective on ERA issues. Provides additional information on European e-Infrastructures to be integrated in the assessment report (D.3.1) and in the ERINA+ dashboard. Reports both qualitative and quantitative results in a dedicated chapter of the assessment report. Sets the stage for policy recommendations on improving both usage and exploitation of e-Infrastructures as a means of supporting transformational processes in science and society at large.

  34. SPA–Matrix Question Development

  35. stakeholders.erinaplus.eu (online) • To obtain qualitative feedback from key stakeholders and national organisations • To collect data and information on various issues of the visibility of e-Infrastructures, in the research infrastructure and in society at large • To support the ERINA+ methodology with semi-quantitative estimation on key issues of e-Infrastructures (interfaces, workflow, spatial and technological visibility, etc.) • To support policy makers with evaluations of e-Infrastructure investments

  36. Assessment Page

  37. Assessment Page

  38. Assessment Page

  39. ERINA+ Platform Data Collection / Assessment Status Antonella Passani

  40. Data Gathering Status • Projects data gathering: finalised • Users assessment: ongoing for a few projects • Stakeholders perception analysis: ongoing • We are now analysing the information gathered so far • The ERINA+ webtools are online and can be used by projects and users without restrictions

  41. Response Figures • 21 projects filled in the webtool • 3 of them are Support Actions • 102 users evaluated the projects’ outputs • 1 stakeholder provided its opinion on e-Infrastructures benefits, challenges and future developments

  42. Project Direct Users

  43. Competitiveness and Excellence of Research Analysed projects produced: • 114 published papers in journals with an impact factor • 62 papers without impact factor and 96 articles in conference proceedings • 54 IPRs other than patents

  44. Innovation and Transfer Outside the Domain • Up to August 2012, projects carried out 213 dissemination activities involving more than 246,000 people in Europe and beyond, including the United States, Latin America and China • 4 spin-offs

  45. Cohesion • 143 cases of active collaboration among projects • In 8 cases, one project provides support to another one • In 34 cases, projects contribute to expand, improve, optimize, review or standardize existing e-Infrastructures • In 28 cases, projects reuse hardware, software, middleware, algorithms and functionalities of existing e-Infrastructures.

  46. Where we are now… • The ERINA+ methodology has been approved • The webtool has been successfully tested and is online. What is left to do… • Aggregated data analysis • Personalised reports for the projects that collaborated with us

  47. Thank you for your attention! • Q & A • Open Discussion
