"How do we know CETL outputs have effects?"
Murray Saunders, University of Lancaster

Presentation Transcript


  1. "How do we know CETL outputs have effects?" Murray Saunders, University of Lancaster. UK CETL Evaluation Framework Consultation, 28 July 2009, National College for School Leadership, Nottingham

  2. Discussion
  • What is the focus for a final self-evaluation report?
  • Levels of effect
  • Indicators of effect
  • Example: CETL sponsorship of an R&D project
  • Some issues

  3. Broad purposes and focus: an integrated approach
  “Evaluation is the purposeful gathering, analysis and discussion of evidence from relevant sources about the quality, value, worth and effect of provision, development or policy.”
  • Evaluation for knowledge and learning (e.g. obtaining a deeper understanding of the organization and practice of teaching and learning)
  • Evaluation for development (e.g. providing evaluative help to strengthen sectors, institutions, departments, courses or projects)
  • Evaluation for accountability (e.g. measuring results, effects or efficiency)

  4. Effects and levels of focus in a final report
  • Level 1: Quality of the experience of the intervention (teaching quality, management quality, etc.)
  • Level 2: Quality of the situated learning outcomes (in the case of technology or knowledge transfer)
  • Level 3: Reconstructed learning in new environments and practices; embedded effects
  • Level 4: Quality of institutional or sector effects
  • Level 5: Impact on macro or long-term strategic objectives
  The shift here is from volume and descriptions to effects.
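
  A note on use: these levels can double as a coding frame for the evidence a CETL gathers. The sketch below is a hypothetical illustration (Python; the level wording is taken from this slide, every other name is invented for the example) of tagging evidence items by level and grouping them, which makes visible whether a draft report rests on description (levels 1 and 2) or on effects (levels 3 to 5).

    from dataclasses import dataclass

    # Coding frame: the five levels of effect named on this slide.
    LEVELS = {
        1: "Quality of the experience of the intervention",
        2: "Quality of the situated learning outcomes",
        3: "Reconstructed learning in new environments and practices",
        4: "Quality of institutional or sector effects",
        5: "Impact on macro or long-term strategic objectives",
    }

    @dataclass
    class Evidence:
        source: str   # e.g. "interview with course leader" (invented example)
        summary: str  # what the item shows
        level: int    # 1-5, keyed into LEVELS

    def group_by_level(items):
        """Group evidence items under their level of effect."""
        grouped = {n: [] for n in LEVELS}
        for item in items:
            grouped[item.level].append(item)
        return grouped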

  5. A methodology for identifying ‘effects’

  6. Starting points
  • Identifying and mapping the core mechanisms and their outputs (outputs as process, as designs, as resources, as changed practices, as thinking: each has different implications for how to estimate value)
  • Understanding effects as ‘change in recurrent practice’, for example:
  • Systems and protocols (e.g. assessment practices)
  • Teachers’ practices (e.g. pedagogic practice and reflection)
  • Students’ learning practices (e.g. research project teams)

  7. Indicators of effects
  • Mode 1: Indicators interpreted as areas, activities, domains or phenomena on which evidence will be collected (open)
  • Mode 2: Indicators interpreted as the evidence itself (retrospective)
  • Mode 3: Indicators as a pre-defined or prescribed state to be achieved or obtained; in this way indicators constitute desired outcomes (closed and prospective)

  8. What counts as an indicator of effect?
  • CETL outputs as resources for development (what)
  • When, where and to whom (identifying who uses CETL outputs and under what circumstances)
  • Enabling indicators (new policies, new people, new spaces, new courses: focus on use)
  • Process indicators (new change practices [how to bring about changes], new attitudes, new cultures, e.g. networks)
  • Outcome indicators (emergent recurrent practices, new resources, the use of outputs)
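
  The three indicator families can be made similarly concrete. Here is a hypothetical sketch (same illustrative Python as above; the category names and their parenthetical glosses come from this slide, while the register entries are invented examples, not CETL data):

    from enum import Enum
    from dataclasses import dataclass, field

    class IndicatorType(Enum):
        ENABLING = "enabling"  # new policies, people, spaces, courses: focus on use
        PROCESS = "process"    # new change practices, attitudes, cultures (e.g. networks)
        OUTCOME = "outcome"    # emergent recurrent practices, new resources, use of outputs

    @dataclass
    class Indicator:
        name: str
        kind: IndicatorType
        evidence: list = field(default_factory=list)  # items gathered against the indicator

    # Invented entries, for illustration only.
    register = [
        Indicator("CETL-funded teaching space in timetabled use", IndicatorType.ENABLING),
        Indicator("cross-departmental teaching network meeting each term", IndicatorType.PROCESS),
        Indicator("CETL materials embedded in recurrent course practice", IndicatorType.OUTCOME),
    ]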

  9. CETL sponsorship of an R&D project
  • Identifying key stakeholders (who)
  • Focus on ‘enabling processes’ (how: evidence on recipients’ experience and other stakeholders, lessons for future development; interviews, narratives, vignettes)
  • R&D project profiles (how: evidence on types and approaches adopted in the projects)
  • Nature of R&D outputs (what: documentary evidence; typologies, e.g. curricula, teaching activities, approaches, research findings)
  • Use of R&D outputs (what: emerging recurrent practices):
    - use of outputs in recipients’ practices
    - use of outputs in colleagues’, students’, institutional and sector practices
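
  Read as a whole, this slide is a who/how/what evidence plan, and it can be written down as one. A hypothetical sketch (the structure mirrors the slide; the wording is abridged from it):

    # Evaluation plan for CETL sponsorship of an R&D project.
    plan = {
        "who":  {"focus": "key stakeholders",
                 "evidence": ["stakeholder identification"]},
        "how":  {"focus": "enabling processes and project profiles",
                 "evidence": ["interviews", "narratives", "vignettes"]},
        "what": {"focus": "nature and use of R&D outputs",
                 "evidence": ["documentary typologies", "records of recurrent use"]},
    }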

  10. Issues
  • The ‘chimera’ of certainty and the difficulty of establishing lines of determination; the diagnostic function: does it work, and does it fit our values and vision of teaching and learning?
  • Indicative and evocative: courtrooms, not laboratories
  • What counts as evidence? Narratives and vignettes
  • Comparisons, scope and the quantitative dimension
  • Unintended and unanticipated effects
  • Audiences and use
