
Researching and evaluating equity initiatives Evaluating WP Initiatives: Overcoming the Challenges

Open University, 28 February 2019. Annette Hayton, University of Bath.




Presentation Transcript


  1. Researching and evaluating equity initiatives Evaluating WP Initiatives: Overcoming the Challenges Open University, 28 February 2019 Annette Hayton, University of Bath

  2. Accountability, context & impact
  • Research has increased understanding of the reasons for low participation and attainment among under-represented groups, but it has been:
    • descriptive – not focussed on making a difference
    • often not disseminated to practitioners or policy makers
    • rarely used to inform planning, evaluation and monitoring
  • Monitoring for the OfS, SMTs and Government has focussed on:
    • value for money
    • demonstrating the effectiveness of WP interventions
  • Practitioner research/evaluation has focussed on:
    • the successful delivery of activities
    • reporting to the OfS, funders and SMTs

  3. Accountability, context & impact
  Pressure to demonstrate the effectiveness of interventions and value for money is not unreasonable. BUT efforts to achieve accountability, 'rigour' and comparability often result in simplistic approaches to evaluation based on medical models. We lose sight of the underlying reasons for inequalities, of context, and of the complexity of successful interventions.

  4. Accountability, context & impact
  Picciotto warns against the 'lure of the medical model': 'Experimental black boxes are poorly suited to the evaluation of complicated or complex programmes in unstable environments' (Picciotto, 2012: 223).
  'A field called translational science has been invented to concentrate on bridging laboratory findings with clinical experience' (Fendler, 2016).
  'RCTs [are] often premised on students having a problem or "symptoms" that require treatment ... these students are pathologised first by naming their problem (often expressed in terms that match the solutions at hand) and then by being treated with an intervention by some external agency or person' (Gale, 2017: 4).

  5. Accountability, context & impact
  How can we assess the effectiveness of interventions?
  '"What works" is a matter of judgement rather than data, and ... this judgement is imbued with moral and ethical concerns' (Morrison, 2001: 79).
  A decade-long debate within the Development Evaluation community reached an uneasy consensus that a mixed-methods approach was required (Picciotto, 2012: 215–16).
  Copestake argues for measurement based on the notion of 'reasonableness', involving a range of stakeholders: 'This falls short of scientific certainty, but in complex situations it is often as much as we can hope for ... to aim higher may be counterproductive in terms of cost, timeliness and policy relevance' (Copestake, 2014: 417).

  6. Accountability, context & impact
  'Widening participation work is, or at least should be, based on the personal ... in which young people are enabled to make choices and decisions, develop strategies and goals, plan for their futures, and are motivated, inspired and empowered' (Hayton and Stevenson, 2018).
  Nygaard and Belluigi (2011) argue that decontextualized approaches to evaluating learning and teaching are rooted in a static conception of learning; more creative and flexible pedagogies are required, along with a contextualized model of evaluation that stresses that 'relations between individual and fellow students, teachers, administration are determined by context'.

  7. Mixed methods • Monitoring • Progression outcomes • Impact of activities • Process evaluation

  8. Understanding the challenge: theories of change
  • Important for the sector to move beyond descriptive research to action
  • BUT the theories of change currently being presented are too simplistic and linear
  • Understanding the processes involved in bringing about change means:
    • defining your interventions
    • determining the impact of your work

  9. Effective theory of change • Aims for interventions informed by theory, research and practice • Interventions reflect the aims • Appropriate methods used to generate useful data • Evidence to demonstrate impact and inform practice and theory

  10. The NERUPI Framework Designed to maximise the impact of Widening Participation interventions providing: • a robust theoretical and evidence-based rationale for the types of intervention that are designed and delivered • clear aims and learning outcomes for interventions, which enable more strategic and reflexive design and delivery • an integrated evaluation process across multiple interventions to improve data quality, effectiveness and impact

  11. Key theoretical influences • Nancy Fraser on social justice • Sen and Walker's concepts of capability • Yosso's community cultural wealth • Identities and possible/future selves • Young and Maton's ideas of knowledge • Critical pedagogies

  12. Making a difference: praxis • Theory & academic research – quantitative and qualitative • Practice • Policy
  Praxis: 'reflection and action directed at the structures to be transformed' (Paulo Freire, 1968)

  13. Context & the field of HE progression

  14. Why do some students do well? • Economic capital • Social capital -who you know • Cultural capital – what you know Pierre Bourdieu Resource differences and collective efforts and investments made or not within families become translated into individual ‘ability’........ (Ball 2010, p.162).

  15. The NERUPI Framework

  16. Aims and Objectives Level 3

  17. Aim, objective & learning outcome

  18. Action research reflective cycle for WP
  • ANALYSIS: theory – OfS policy – local context – data – knowledge
  • PLANNING: aims – targeting – interventions – evaluation strategy – logistics
  • COLLECT DATA: monitoring – tracking – related stats – process – impact
  • ACTION: deliver the interventions
  • ANALYSIS: cycle repeats

  19. NERUPI Framework • A set of aims and objectives for interventions informed by theory, research and practice • Can encompass specific intervention-based aims • A common language for planning and reporting • Choice of appropriate methods according to context of intervention • Evidence to demonstrate impact and inform practice and theory

  20. Find out more: www.nerupi.co.uk NERUPI Members Event 11 March 2019 The Capability Approach: Beyond the Deficit Model for Student Success NERUPI Open Event 14 June 2019 Introduction to the NERUPI Framework

  21. References
  • Copestake, J. (2014) 'Credible impact evaluation in complex contexts: Confirmatory and exploratory approaches'. Evaluation, 20 (4), 412–27.
  • Fendler, L. (2016) 'Ethical implications of validity-vs-reliability trade-offs in educational research'. Ethics and Education, 11 (2), 214–29.
  • Fraser, N. (2003) 'Social justice in the age of identity politics: Redistribution, recognition, and participation'. In Fraser, N. and Honneth, A. Redistribution or Recognition? A political-philosophical exchange. Trans. Golb, J., Ingram, J. and Wilke, C. London: Verso, 7–109.
  • Freire, P. (1972) Pedagogy of the Oppressed. Trans. Ramos, M.B. Harmondsworth: Penguin Books.
  • Gale, T. (2017) 'What's not to like about RCTs in education?'. In Childs, A. and Menter, I. (eds) Mobilising Teacher Researchers: Challenging educational inequality. London: Routledge, 207–23.
  • Hayton, A. and Bengry-Howell, A. (2016) 'Theory, evaluation, and practice in widening participation: A framework approach to assessing impact'. London Review of Education, 14 (3), 41–53.
  • Morrison, K. (2001) 'Randomised controlled trials for evidence-based education: Some problems in judging "what works"'. Evaluation and Research in Education, 15 (2), 69–83.
  • Nygaard, C. and Belluigi, D.Z. (2011) 'A proposed methodology for contextualised evaluation in higher education'. Assessment and Evaluation in Higher Education, 36 (6).
  • Picciotto, R. (2012) 'Experimentalism and development evaluation: Will the bubble burst?'. Evaluation, 18 (2), 213–29.
