Promoting and assessing research impact



Presentation Transcript


1. Promoting and assessing research impact Sandra Nutley

2. UKES stream on policy and practice impacts of evaluation
  • Theoretical and practical aspects of how policy and practice impact come about and can be optimised
  • How such impacts can themselves be evaluated and assessed

3. Research Unit for Research Utilisation (RURU) Sandra Nutley, Huw Davies, Tobias Jung, Isabel Walter, Joanne McLean, Sarah Morton, Joyce Wilkinson, (+ Ruth Levitt, Catherine Lyall, Laura Meagher, Bill Solesbury)

4. Using Evidence: How research can inform public services (Nutley, Walter and Davies, Policy Press, 2007)

7. Conducive policy environments

8. Evidence use is complex – because ‘policy making’ is complex
SOMETIMES (the rational high ground):
  • clearly defined event
  • explicit decisions involving known actors
  • conscious deliberation
  • defined policies
  • policy fixed at implementation
OFTENTIMES (the swampy low ground):
  • ongoing process
  • piecemeal: no single decision
  • many actors muddling through
  • policies emerge and accrete
  • shaped through implementation

9. Challenges of linking two worlds
Divergent:
  • concerns, priorities, incentives
  • language, dynamics
  • conceptions of knowledge and time-scales
  • status and power
Leading to:
  • communication difficulties
  • mismatch between supply and demand
  • rejection and implementation failure

10. But many active players in policy networks

11. Improving evidence use: addressing supply, demand, and that in between
Much of the thinking about how to improve research use in public policy has focused on ways of improving the supply of evidence, ways of increasing the demand for evidence, and mechanisms to improve the connection between the two (such as the intermediation offered by knowledge brokers and think tanks). This is helpful to some extent, but it still rests on the assumption that we are basically dealing with two separate communities.

12. Lesson 1: Improve the supply of relevant, accessible & credible evidence… but don’t stop there
  • R&D strategies
  • Improving capacity (and incentives) internally and externally
  • Research commissioning processes
  • Supporting synthesis of existing studies
  • Better dissemination and archiving

13. Lesson 2: Shape – as well as respond to – the demand for evidence in policy and practice settings
  • Government commitment to an evidence-informed approach
  • Improve analytical skills of policy makers and practitioners
  • Address incentives
  • (For external researchers) work with advocacy organisations to shape the context for specific policies

14. The importance of context
  • Interaction with other types of knowledge (tacit; experiential)
  • Multi-voiced dialogue
  • ‘Use’ as a process, not an event
Lesson 3: Develop multifaceted strategies to address the interplay between supply & demand

15. Generic features of effective practices to increase research impact
  • Translation: adaptation of findings to specific policy and practice contexts
  • Enthusiasm: of key individuals; personal contact is most effective
  • Contextual analysis: understanding and targeting specific barriers to, and enablers of, change
  • Credibility: strong evidence from a trusted source, including endorsement from opinion leaders
  • Leadership: within research impact settings
  • Support: ongoing financial, technical & emotional support
  • Integration: of new activities with existing systems and activities

16. Lesson 4: Recognise the role of dedicated knowledge broker organisations/networks
Three brokerage frameworks (based on Oldham and McLean 1997):
  • Knowledge management: facilitating the creation, diffusion and use of knowledge
  • Linkage and exchange: brokering the relationship between ‘creators’ and ‘users’
  • Capacity building: improving capacity to interpret and use evidence – and to produce more accessible analytical reports

17. Lesson 5: Target multiple voices to increase opportunities for evidence to become part of policy discourse
  • Feeding evidence into wider political debate – informed public debate
  • Deliberative inquiry, citizen juries, etc.
  • More challenging approach for governments: ‘letting go’
  • More challenging role for researchers and research advocates: contestation and debate

18. Lesson 6: Evaluate strategies to improve evidence use and learn from this
  • Rarely done in any systematic way
  • What works, for whom and in what circumstances?

20. Why assess impacts? Because we are increasingly asked to do this, but…
  • Who are the key stakeholders, and why do they want this information?
  • Is assessment for summative or formative purposes?
  • What impacts are desired/expected/reasonable?

21. What counts as use/impact?
  • Instrumental use
  • Conceptual use
  • Symbolic use

22. What impacts should be assessed?
  • What types of use/impacts are of interest? e.g. instrumental, conceptual, symbolic
  • Are we interested in outputs (what is produced), processes (extent and nature of research use), or outcomes (actual consequences of that use)?
  • Can we track all impacts, expected and unexpected? (And what about ‘unfortunate uptakes’ and other dysfunctional responses?)

24. How should impacts be assessed?
  • By tracking forwards from specific findings to use, and the actual/potential implications of such use – e.g. the Payback model, with five categories: knowledge production; research capacity building; policy or product development; sector benefits; wider societal benefits
  • By tracking backwards from decisions or practice behaviours to the (research-based) influences on these – e.g. user panels

25. Where, who and when? How wide to cast the net? Impact often occurs far down the line

27. Common conclusions on methodology
  • Case study methods to take account of differing types of research and contexts for impact
  • Combination of quantitative and qualitative methods and indicators within a case study approach
  • Need for a research use/impact model to guide data collection and analysis

29. Specific challenges
  • Case sampling: uneven and possibly misleading
  • Getting away from linear models: research use and impact are often non-linear and highly mediated
  • Attribution and additionality: constructing a convincing impact narrative (cf. contribution analysis)
  • Receptivity of context: assessing actual or potential impacts?

30. Key messages
  • In developing impact strategies, use the existing evidence on ways of improving evidence use
  • In evaluating the impact of research and evaluation:
    - Make realistic assumptions about the nature and processes of research use and impact; these are many and complex
    - No single model of research use is likely to be sufficient for all situations where impact is to be assessed
    - Make choices about where and how to look for impacts based on the purpose of the impact assessment

31. Some references
  • ESRC (2009) Taking Stock: A summary of ESRC’s work to evaluate the impact of research on policy and practice, http://www.esrcsocietytoday.ac.uk/takingstock
  • Davies HTO and Nutley SM (2008) ‘Learning More about How Research-Based Knowledge Gets Used: Guidance in the Development of New Empirical Research’, Working Paper for the WT Grant Foundation, New York
  • Meagher L, Lyall C and Nutley S (2008) ‘Flows of knowledge, expertise and influence: a method for assessing policy and practice impacts from social science research’, Research Evaluation 17(3): 163-173
  • Boaz A et al. (2008) ‘Assessing the impact of research on policy: A review of the literature’, King’s College London/PSI
