
Real-world impacts from research: Evidence & lessons


Presentation Transcript


  1. Real-world impacts from research: Evidence & lessons. David Pannell, Centre for Environmental Economics and Policy, School of Agricultural and Resource Economics. For this PPT see www.davidpannell.net under “Talks”

  2. Growing interest • Perception: we need to do better at convincing government about benefits of research • ARC discussing how to include real-world impact in ERA • UK’s Research Excellence Framework: 20% of funding based on “impact” from 2014.

  3. Trial by universities, 2012 • Group of Eight (Go8) and Australian Technology Network of universities (ATN) • Each university submitted cherry-picked case studies (165 submissions) • Evaluated by people from industry & government • 24 ‘best’ selected

  4. Plan • An example research project • Was selected in the Go8/ATN trial • Some evidence about impact • Measuring impact • Strategies for having impact

  5. Example

  6. 2000: Salinity was a hot topic

  7. $1.4 billion of public funding

  8. I was shocked • Poor design of the program • Program developers seemed to have been unaware of crucial areas of salinity research and their implications • No chance of any significant benefits

  9. My response • Media • Discussion papers • Presentations • Submissions

  10. Tried to help them • Developed INFFER (Investment Framework for Environmental Resources) • A tool for integrating the science with other info • Develop logical, evidence-based environmental projects • Assess value for money • Prioritise projects

  11. INFFER strategy • Extensive input by users • Make tools as simple as possible • Provide training and help desk for users • Readable documentation • Public critiques of existing approaches • Attempt to influence gov’t agencies to change the signals

  12. Regional NRM application

  13. Policy impacts • Senate inquiry (2006) • Recommended use of INFFER • NRM Ministerial Council (2007) • Endorsed new set of principles for investment in salinity • Victorian Government, Biodiversity White Paper • “INFFER will be utilised for the next five years”. • Caring for our Country • Influenced design of project template

  14. Lessons: Use of science • If you want people to use good science, the people issues are crucial • Relationships • Communication • Most prospective users were happy with current (very poor) approach • Didn’t perceive that government would reward them for doing it better

  15. Lessons: User capacity • Lack of capacity to formally integrate disparate technical and socio-economic information for decision making • Lack of expertise in economics and social science • Lack of time to read things • People misinterpret things easily

  16. Research versus impact? Or research and impact? • Has taken considerable effort beyond traditional research • Time commitment • New skills and knowledge • New networks • Satisfying but very challenging to make a difference • Worth it?

  17. Research versus impact? Or research and impact? • Various benefits for my research • Interesting problems and issues arise • Innovation - outside what’s currently in journals • Better understanding of research relevance • Journal papers generated • Directly part of the INFFER work: 17 • Related/stimulated by: 16 • Reputation for useful research → easier to get funding (unsolicited approaches offering $)

  18. Evidence about impact

  19. Evidence of high returns • Estimated rates of return to R&D are typically very high • Can indicate 30%, 50%, 100% annual rate of return • Credible? • $1 invested at 50% over 100 years ≈ $4 × 10^17 (a million times Australia’s annual GDP; see the check below) • Sound analyses still show good returns • For both applied and basic research
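A quick sanity check of the compound-growth figure quoted on the slide; this is plain Python and assumes nothing beyond the numbers on the slide:

```python
# $1 compounding at a 50% annual rate of return for 100 years
value = 1.0 * 1.5 ** 100
print(f"${value:.2e}")  # ≈ $4.07e+17, i.e. roughly $4 x 10^17
```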

  20. Heterogeneity • The distribution of benefits is highly skewed • Most research has low impact • A small number of projects have huge impact • More than enough to pay for the rest

  21. Example: CRC program • Benefits for 1991 to 2017 • The CRC program generated a net economic benefit of $7.5 billion over the study period • Annual contribution of $278 million • BCR = 3.1
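The annual figure is consistent with the headline total averaged over the 27-year study window. The even-averaging assumption below is mine; the underlying evaluation may apportion benefits differently:

```python
# Headline CRC program figures from the slide
total_net_benefit = 7.5e9        # dollars, 1991-2017
years = 2017 - 1991 + 1          # 27 years inclusive
print(f"${total_net_benefit / years:,.0f} per year")  # ≈ $277,777,778 ≈ $278 million
```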

  22. Impact is often slow • Lags to impact usually measured in decades • e.g. US agriculture • From first investment to peak impact = 24 years • Still generating benefits after 50 years • Several lags • Research lag • Commercialisation lag • Adoption lag • Impact lag

  23. Longer lags = lower net benefits • Discounting → allowing for interest costs on the up-front investment • 30-year lag, 7% discount rate, benefits reduced by 87% (see the calculation below) • The high measured rates of return occur despite the long time lags
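The 87% figure follows from standard discounting arithmetic; a minimal check:

```python
# Present value of $1 of benefit arriving 30 years from now, at a 7% discount rate
discount_factor = 1 / 1.07 ** 30
print(f"discount factor ≈ {discount_factor:.3f}")  # ≈ 0.131
print(f"reduction ≈ {1 - discount_factor:.0%}")    # ≈ 87%
```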

  24. Supply push vs demand pull • Science push (Bush, 1945) • Implicit in the “linear model” • Basic R → Applied R → Technology → Benefits • Demand pull (Schmookler, 1966) • Market demand → Applied R → Technology → Benefits • Big debate in the 1960s • Resolved in the 1970s – innovation is an iterative process – both push and pull matter

  25. Measuring impact

  26. Determinants of benefits • Scale of relevance • Adoptability of the research • Benefits per unit • Probability of research success • Share of the credit attributable to particular research • Time lags
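One simple way to see how these determinants interact is to multiply them together and discount for the time lag. The sketch below is an illustration only: the multiplicative form, the variable names and the numbers are assumptions made for the example, not the method used on the slide or in INFFER.

```python
# Illustrative expected-benefit calculation combining the slide's determinants.
# All names and numbers are hypothetical.
def expected_benefit(scale, adoption, benefit_per_unit, p_success,
                     attribution_share, lag_years, discount_rate=0.07):
    undiscounted = scale * adoption * benefit_per_unit * p_success * attribution_share
    return undiscounted / (1 + discount_rate) ** lag_years

# e.g. 1 million ha of relevance, 40% adoption, $20/ha benefit,
# 70% chance of research success, half the credit, 15-year lag
print(f"${expected_benefit(1e6, 0.4, 20, 0.7, 0.5, 15):,.0f}")  # ≈ $1,015,000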

  27. With vs without

  28. Applicability? • The theory is relatively straightforward • It has been applied successfully in many case studies • Especially agriculture

  29. But … • It takes resources and skills • Easier … • for physical products than for knowledge • if the benefits arise in markets • if the benefits occur quickly • for applied than for basic research • Much university research is not in the categories that are relatively easy to evaluate • Knowledge, public goods, long time lags, basic

  30. What will ERA do? • Perhaps copy the UK Research Excellence Framework • Two components • Case studies of impact • The submitting unit's approach to enabling impact from its research • They won’t expect an economic evaluation

  31. If it’s case studies, you’ll need to • Make the case/tell the story • Link elements in chain from research to impact • Provide evidence • Note: in Go8/ATN trial, many nominations did this poorly • The chain was incomplete • The evidence was weak/unconvincing • If you can do it well, you’ll stand out

  32. Having an impact

  33. How to have an impact? • There is little research about this • There are papers, but largely anecdotal • Some resources at end of PPT

  34. Chain from research to impact • The chain varies widely from case to case • Can have many links • Understanding the chain for your research helps you to • choose, design and deliver research for greater impact • communicate impact • provide evidence

  35. A chain from research to impact: Technology • Research and development • Sell the IP • Feasibility studies • Design • Manufacturing capacity • Finance • Marketing • Sales

  36. A chain from research to impact: Information for policy • Research • Something useful is learned (or isn’t) • New information influences policy (or doesn’t) • Policy change is implemented (or isn’t) • If policy aims to change behaviour, people respond as intended (or don’t) • Changes (relative to no research) result – social, environmental or economic benefits (or not)
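Each “(or isn’t/doesn’t)” in this chain is a point where impact can fail, which is one way to understand why the distribution of benefits is so skewed (slide 20). A small illustration, with probabilities invented for the example rather than taken from the presentation:

```python
# Hypothetical probabilities that each link in the policy-impact chain succeeds
links = {
    "useful result":        0.7,
    "influences policy":    0.4,
    "policy implemented":   0.6,
    "behaviour changes":    0.6,
    "benefits materialise": 0.8,
}
p_impact = 1.0
for p in links.values():
    p_impact *= p
print(f"chance of end-to-end impact ≈ {p_impact:.0%}")  # ≈ 8%
```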

  37. Risk of low benefits from research to influence policy • Nobody is listening • You lack credibility with the decision maker • The decision maker doesn’t understand • The new results are not different enough from what we already know • The decision depends more on other factors • The decision options have similar payoffs

  38. Lessons: having impact • Need some demand pull • Understand and respect potential users • Be prepared for opposition • Need perseverance, continual marketing • Need repetition – government has short memory • Seek a product champion

  39. Lessons: having impact • Need “absorptive capacity” in the organisation • The political circumstances need to be right. You can’t change ideological positions of govt. • Timing. Grasp opportunities. • Good communication • Simplicity, brevity, clarity • Avoid jargon, maths, complex graphs • Think about impact when choosing what to research

  40. Conclusion • We are going to be asked to demonstrate real-world impact • It’s not just about communicating what we do better – we may need to change what we do to have genuine impact • Pursuing impact is exciting and worthwhile but challenging – spinoff benefits for research • The earlier in the research process you start thinking about impact, the better

  41. Resources • Pannell, D.J. and Roberts, A.M. (2009). Conducting and delivering integrated research to influence land-use policy: salinity policy in Australia, Environmental Science and Policy 12(8), 1088-1099. • http://dpannell.fnas.uwa.edu.au/dp0803.htm • Pannell, D.J. (2004). Effectively communicating economics to policy makers. Australian Journal of Agricultural and Resource Economics 48(3), 535-555. • http://dpannell.fnas.uwa.edu.au/j78ajare.pdf

  42. Resources • Weible et al. (2012). “Understanding and influencing the policy process”, Policy Sciences 45, 1-12. • http://link.springer.com/article/10.1007%2Fs11077-011-9143-5

  43. Pannell Discussions (Blog posts) • 150 – Why don’t environmental managers use decision theory? • http://www.pannelldiscussions.net/2009/04/150-why-dont-environmental-managers-use-decision-theory/ • 136 – Engaging with policy: tips for researchers • http://www.pannelldiscussions.net/2008/09/136-engaging-with-policy-tips-for-researchers/

  44. Resources • A relevant blog post by ecologist Brian McGill on “What it takes to do policy-relevant science” • http://dynamicecology.wordpress.com/2013/05/14/what-it-takes-to-do-policy-relevant-science/ • Video: Ben Martin (U Sussex) “Science Policy Research - Can Research Influence Policy? How? And Does It Make for Better Policy?” • http://upload.sms.csx.cam.ac.uk/media/747324

  45. For this PPT see www.davidpannell.net under “Talks”
