Testing the validity of indicators in the field of education

Testing the validity of indicators in the field of education: The experience of CRELL. Rome, October 3-5, 2012, Improving Education through Accountability and Evaluation: Lessons from Around the World. Andrea Saltelli, European Commission, Joint Research Centre.


Presentation Transcript



Testing the validity of indicators in the field of education: The experience of CRELL

Rome, October 3-5, 2012, Improving Education through Accountability and Evaluation: Lessons from Around the World

Andrea Saltelli, European Commission, Joint Research Centre



CRELL

Centre for Research on Lifelong Learning based on indicators and benchmarks

DG Education and Culture + Joint Research Centre, since 2005

http://crell.jrc.ec.europa.eu/



Focus

Foci of econometric research at the JRC, Ispra

  • CRELL

  • Trajectories to achieving EU 2020 objectives

  • Employability and other benchmarks (mobility, multilingualism)

  • Labour market outcomes



Focus

Foci of econometric research at the JRC, Ispra

  • Counterfactual analysis and other impact assessment methodologies

  • Regional studies (competitiveness, innovation, well-being)

  • Composite indicators and social choice




Indicators



Context: knowledge in support of policy (evaluation and impact assessment), but also advocacy

Caveat: Validity = plausibility, defensibility … and not ‘proof of truth’



Sensitivity analysis

When testing the evidence, some reasonable people (and guidelines) suggest that ‘sensitivity analysis would help’.

The JRC has fostered the development and uptake of sensitivity analysis (20 years of papers, schools and books).

Today we call it sensitivity auditing and teach it within the impact-assessment syllabus run by the Secretariat-General.



Sensitivity analysis

How coupled ladders are shaken in most of the available literature

How to shake coupled ladders
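The point of the metaphor is that one-at-a-time (OAT) designs, which move each factor separately while holding the others fixed, explore only a vanishing fraction of the factor space as dimensionality grows, while global methods vary all factors simultaneously. A minimal sketch of the geometric argument (the function name is illustrative, not CRELL code): OAT points stay inside the ball inscribed in the factor hypercube, whose volume fraction shrinks rapidly with the number of factors.

```python
import math

# OAT designs stay inside the ball inscribed in the factor hypercube.
# The fraction of the hypercube that this ball occupies vanishes as the
# number of factors k grows -- the standard argument for global designs.
def explored_fraction(k):
    ball = math.pi ** (k / 2) / math.gamma(k / 2 + 1)  # volume of unit k-ball
    cube = 2.0 ** k                                    # hypercube of side 2
    return ball / cube

for k in (2, 5, 10):
    print(k, round(explored_fraction(k), 4))
```

For two factors the inscribed disc still covers about 79% of the square, but by ten factors the covered fraction is below 1%.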



Sensitivity analysis

Testing (composite) indicators: two approaches

Michaela Saisana, Andrea Saltelli, and Stefano Tarantola (2005). Uncertainty and sensitivity analysis techniques as tools for the quality assessment of composite indicators. J. R. Statist. Soc. A, 168(2), 307–323.

Paolo Paruolo, Michaela Saisana, and Andrea Saltelli. Ratings and rankings: voodoo or science? J. R. Statist. Soc. A, 176(2), 1–26.



Sensitivity analysis

First: The invasive approach

Michaela Saisana, Béatrice d’Hombres, and Andrea Saltelli (2011). Rickety numbers: volatility of university rankings and policy implications. Research Policy, 40, 165–177.


Robustness analysis of SJTU and THES


SJTU: simulated ranks – Top 20

  • Harvard, Stanford, Berkeley, Cambridge, MIT: top 5 in more than 75% of our simulations.

  • Univ California SF: original rank 18th, but could be ranked anywhere between the 6th and the 100th position

  • Impact of assumptions: much stronger for middle-ranked universities


THES: simulated ranks – Top 20

  • The impact of uncertainties on university ranks is even more apparent.

  • M.I.T.: ranked 9th, but confirmed only in 13% of simulations (plausible range [4, 35])

  • Very high volatility also for universities ranked in 10th–20th position, e.g., Duke Univ, Johns Hopkins Univ, Cornell Univ.
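The rank volatility described on these slides can be reproduced in spirit by a Monte Carlo exercise: perturb the aggregation weights (one of the uncertain assumptions), recompute the composite, and record the distribution of each unit's rank. A hedged sketch with made-up data (the unit names, scores, and perturbation range are purely illustrative, not the data behind the SJTU/THES analysis):

```python
import random

# Illustrative sub-indicator scores per unit -- made-up numbers.
scores = {
    "Univ A": [0.9, 0.6, 0.8],
    "Univ B": [0.7, 0.9, 0.6],
    "Univ C": [0.8, 0.7, 0.7],
}
base_weights = [0.5, 0.3, 0.2]

def ranks(weights):
    """Rank units (1 = best) by a weighted-sum composite."""
    composite = {u: sum(w * s for w, s in zip(weights, v))
                 for u, v in scores.items()}
    ordered = sorted(composite, key=composite.get, reverse=True)
    return {u: i + 1 for i, u in enumerate(ordered)}

random.seed(0)
rank_samples = {u: [] for u in scores}
for _ in range(1000):
    # Perturb each weight uniformly by +/-50%, then renormalise to sum to 1.
    w = [bw * random.uniform(0.5, 1.5) for bw in base_weights]
    total = sum(w)
    w = [x / total for x in w]
    for u, r in ranks(w).items():
        rank_samples[u].append(r)

for u, rs in rank_samples.items():
    print(u, "median rank:", sorted(rs)[len(rs) // 2],
          "range:", (min(rs), max(rs)))
```

The interval between each unit's best and worst simulated rank is the "plausible range" quoted for MIT above; units whose intervals are wide are the ones whose published rank depends heavily on the modelling assumptions.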



Sensitivity analysis

Second: The non-invasive approach

Comparing the weights as assigned by developers with ‘effective weights’ derived from sensitivity analysis.



University Rankings

Assessing the internal coherence of ARWU versus THES by comparing the weights declared by developers with ‘effective’ importance measures.



The JRC fosters the development of good practices for the construction of aggregated statistical measures (indices, composite indicators).

Partnerships with OECD, WEF, INSEAD, WIPO, UN-IFAD, FAO, Transparency International, World Justice Project, Harvard, Yale, Columbia …

Sixty analyses (Michaela Saisana, JRC)



Something worth advocating for (1):

More use of social choice theory methods both for building meaningful aggregated indicators …

(It is a pity that methods already available between the end of the XIII and the XV century are neglected by most developers.)

… they could also be used more in comparing options in the context of impact assessment studies (course at JRC Ispra, October 11-12).
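Among the social choice methods of that vintage is Borda's rank-sum rule, which aggregates rankings without requiring the sub-indicators to share a measurement unit. A minimal sketch (the units and rankings are invented for illustration):

```python
# Borda rank-sum aggregation: on each indicator a unit ranked p-th out of n
# scores (n - p) points; summing across indicators yields an aggregate
# ranking that never mixes incommensurable units of measurement.
rankings = {                     # best-to-worst order on each indicator
    "indicator_1": ["A", "B", "C", "D"],
    "indicator_2": ["B", "A", "D", "C"],
    "indicator_3": ["A", "C", "B", "D"],
}

def borda(rankings):
    scores = {}
    for order in rankings.values():
        n = len(order)
        for position, unit in enumerate(order):
            scores[unit] = scores.get(unit, 0) + (n - 1 - position)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(borda(rankings))  # aggregate order with Borda scores
```

Because only ranks enter the aggregation, the result is invariant to any monotonic rescaling of the underlying indicators, a robustness property that weighted averages of raw scores do not have.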



Useful links:

Econometrics and Applied Statistics Unit

http://ipsc.jrc.ec.europa.eu/?id=155

Sensitivity Analysis:

http://sensitivity-analysis.jrc.ec.europa.eu/

Sensitivity Auditing: http://sensitivity-analysis.jrc.ec.europa.eu/Presentations/Saltelli-final-February-1-1.pdf

Quality of composite indicators: http://ipsc.jrc.ec.europa.eu/index.php?id=739

