
Knowledge mobilization, setting the context: systematic research synthesis

2005 CESC-SSHRC Symposium, Ottawa, 25th May 2005. David Gough, Social Science Research Unit, Institute of Education, London, UK.


Presentation Transcript


  1. 2005 CESC-SSHRC Symposium, Ottawa, 25th May 2005 Knowledge mobilization, setting the context: systematic research synthesis David Gough, Social Science Research Unit, Institute of Education, London, UK

  2. Social Science Research Unit • Childhood Studies • Evaluation of Social Interventions • Sexual Health, Reproduction and Social Exclusion • Evidence for Policy and Practice Information and Co-ordinating (EPPI) Centre • Perspectives, Participation and Research

  3. EPPI vision (1) • A process of accumulating knowledge about key policy and practice issues • Which involves citizens (different groups of stakeholders) at all stages (setting the question… interpreting and disseminating the findings) • Which is held in accessible formats

  4. EPPI vision (2) • Which is free from ‘technical’ language • Which results in usable and useful ‘evidence’ • Which is subject to constant reflection and development • Which enables citizens to become experts

  5. Need for synthesis • Brings together what we know, whatever you are studying • Contextualizes information from new studies • Involves explicit systematic methods and thus transparency • These may be lacking in non-systematic reviews and expert opinion, however excellent they may be

  6. Key features of a systematic review • Synthesises the results of primary research • Uses explicit and transparent method • A piece of research, following standard set of stages • Accountable, replicable, updateable • Need for user involvement

  7. Question-led synthesis • Questions looking for answers • Make implicit assumptions explicit • All types of question, so all types of research design • Statistical, narrative empirical and conceptual synthesis • Mixed methods synthesis

  8. Not just any reviews: 6 reviews of older people and accident prevention (Oliver et al. 1999) • Total studies included: 137 • Common to at least two reviews: 33 • Common to all six reviews: 2 • Treated consistently in all reviews: 1

  9. User question-led synthesis • What do we want to know? • Who wants to know and why? • What do we know and how do we know it? • What more do we need to know and how can we know it? • Cannot be value free • Involves intellectual work

  10. Dimensions of difference in synthesis models • Review/research questions (impact, process, need, explanatory concepts) • Research designs considered relevant • Types of data - numerical or textual • Quality assessment of different designs • Breadth of designs and study focus • Variation in contribution of each study to the systematic synthesis

  11. Types of systematic synthesis • Numerical • Narrative empirical • Conceptual

  12. Numerical synthesis • For example: statistical meta-analysis of effect sizes from experimental studies of the effect of interventions
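A minimal sketch of what such a numerical synthesis does, assuming the simplest fixed-effect (inverse-variance) model; the three studies and their numbers below are invented purely for illustration.

```python
# Illustrative fixed-effect (inverse-variance) meta-analysis of effect
# sizes, the kind of pooling used in statistical synthesis of
# experimental studies. All data below are hypothetical.

def fixed_effect_meta(effects, variances):
    """Pool per-study effect sizes using inverse-variance weights."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5  # SE of the pooled estimate
    return pooled, pooled_se

# Hypothetical log odds ratios and variances from three trials
log_odds = [-0.20, -0.35, -0.10]
variances = [0.04, 0.09, 0.02]

pooled, se = fixed_effect_meta(log_odds, variances)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled log OR = {pooled:.3f} (95% CI {lo:.3f} to {hi:.3f})")
```

Larger, more precise studies (smaller variance) get more weight, which is the core idea behind pooling effect sizes across trials.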

  13. Does sex education improve the use of contraception amongst young people? From: DiCenso A, Guyatt G, Willan A, Griffith L (2002) Interventions to reduce unintended pregnancies amongst adolescents: a systematic review of randomised controlled trials. BMJ 324: 1426-1434

  14. Narrative empirical • For example, EPPI review of Personal Development Planning (for LTSN): types of PDP and types of outcome (indep. & depend. variables) • Map and in-depth synthesis stage • Weight of evidence • Synthesis from conceptual framework within SR question

  15. Meta-ethnography • Develop new interpretative constructions • Data are concepts, not empirical findings • Emphasis on relevance • Quality assurance: • exclusion criteria on quality, e.g. Campbell R, Pound P, Pope C, Britten N, Pill R, Morgan M, Donovan J (2003) Evaluating meta-ethnography: a synthesis of qualitative research on lay experiences of diabetes and diabetes care. Social Science and Medicine 56(4): 671-684 • worth of studies emerging during synthesis, e.g. Noblit G, Hare R (1988) Meta-Ethnography: Synthesizing Qualitative Studies. London: Sage • application of a weighting system (e.g. EPPI's procedures)

  16. Stages of an EPPI-Centre review • User involvement, setting question and developing protocol • Defining studies (inclusion and exclusion criteria) • Searching exhaustively (search strategy) • Describing the key features of studies (MAP; possibly apply further inclusion criteria here) • Assessing their quality / weight of evidence • Synthesising findings across studies (IN-DEPTH REVIEW) • Communication and engagement

  17. Systematic maps and systematic synthesis • Map: What has been done? • maps out research activity (e.g. broader question with multiple designs) • provides context for synthesis • research designs in primary studies part of that context • Synthesis: What is known from what has been done?

  18. Systematic mapping of research [Bar chart: study type (Descriptive, Exp of Rel, Eval Nat Occ, Eval RM) by country of study in the PDP review; countries include USA, UK, Australia, Canada, Hong Kong, Netherlands, Finland, Israel, Japan, Spain, Belgium, China, Singapore, South Africa, Taiwan and Unknown]

  19. ‘Weight of Evidence’* • A. methodological quality of execution of the study (in its own terms) • B. appropriateness of study design to the review question • C. relevance of focus of the study to the review question • D. overall WoE provided by the study to answering the review question. Review authors (not EPPI) determine the WoE of each study on dimensions A, B and C, and then their relative contribution to the total weight D. *Allows use of ‘best evidence’ (Slavin)
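To make the A-to-D structure concrete, here is a hypothetical sketch of combining the three ratings into an overall judgement. The median rule used here is purely an illustrative assumption; in the actual procedure, the overall weight D rests on the review authors' judgement, not a fixed formula.

```python
# Hypothetical 'Weight of Evidence' combiner: reviewers rate a study on
# A (execution quality), B (design appropriateness) and C (relevance),
# and an overall rating D is derived. Taking the median of the three
# ratings is an assumed rule for demonstration only.

RATINGS = {"low": 0, "medium": 1, "high": 2}
LABELS = {v: k for k, v in RATINGS.items()}

def overall_woe(a: str, b: str, c: str) -> str:
    """Return the median of three low/medium/high ratings as overall D."""
    scores = sorted(RATINGS[r] for r in (a, b, c))
    return LABELS[scores[1]]  # middle value of the sorted triple

print(overall_woe("high", "medium", "high"))  # prints "high"
print(overall_woe("low", "medium", "high"))   # prints "medium"
```

A rule like this makes the judgement auditable and repeatable, which is the point of the WoE framework, even though real reviews keep the final weighting as a reasoned judgement.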

  20. EPPI-Centre tools at each review stage • Searching and screening: bibliographic databases • Characterising studies: EPPI-Reviewer keywording • Data extraction and quality / relevance assessment: EPPI-Reviewer data-extraction • Numerical synthesis: EPPI-Reviewer • Narrative ‘empirical’ synthesis: EPPI-Reviewer • Thematic / conceptual synthesis: EPPI-Reviewer to NVivo • Web-based data input / coding and analysis; web-based access to EPPI bibliographic and data-extraction databases, and review reports and summaries

  21. This flexible system allows us to… • Address any type of policy, practice or research question • Include more than one study type and more than one type of synthesis method in the same review • Use different approaches to quality assessment

  22. An EPPI-Centre review including more than one synthesis method Review question, e.g. What is known about the barriers to, and facilitators of, fruit and veg intake amongst children aged 4 to 10 years? • ‘Qualitative’ studies: 1. Application of inclusion criteria 2. Quality assessment 3. Data extraction 4. Qualitative synthesis • Trials: 1. Application of inclusion criteria 2. Quality assessment 3. Data extraction 4. Statistical meta-analysis • Trials and ‘views’ combined: mixed methods synthesis

  23. National and international collaborations undertaking systematic reviews • Cochrane Collaboration (C1) • Campbell Collaboration (C2) • NHS Centre for Reviews and Dissemination (CRD) • ESRC Centre at QMW and Nodes Network • Evidence for Policy and Practice Information and Coordinating (EPPI) Centre: Developing methods for education and the social sciences

  24. Thank you • d.gough@ioe.ac.uk • http://eppi.ioe.ac.uk

  25. 2005 CESC-SSHRC Symposium, Ottawa, 26th May 2005 Knowledge mobilization and policy making David Gough, EPPI-Centre, Social Science Research Unit, Institute of Education, London, UK

  26. Key features of a systematic review • Synthesises the results of primary research • Uses explicit and transparent method • A piece of research, following standard set of stages • Accountable, replicable, updateable • Need for user involvement

  27. User question-led synthesis • What do we want to know? • Who wants to know and why? • What do we know and how do we know it? • What more do we need to know and how can we know it? • Cannot be value free • Involves intellectual work

  28. Stages of an EPPI-Centre review • User involvement, setting question and developing protocol • Defining studies (inclusion and exclusion criteria) • Searching exhaustively (search strategy) • Describing the key features of studies (MAP; possibly apply further inclusion criteria here) • Assessing their quality / weight of evidence • Synthesising findings across studies (IN-DEPTH REVIEW) • Communication and engagement

  29. Factors influencing policy making in government (Davies 2004) • Professional experience & expertise • Pragmatics & contingencies • Political judgement • Research evidence • Lobbyists & pressure groups • Resources • Habits & tradition • Values

  30. Is research relevant to policy? • Habits and tradition • Experience and expertise • Judgement • Resources • Values • Lobbyists and pressure groups • Evidence

  31. Use of research evidence (from Weiss 1979) • Political: support prior view • Tactical: other purpose • Enlightenment: framing of issues • Problem solving: inform decision making • Naïve rational: lead to decision making

  32. Types of research evidence (Lomas 2005, adapted from Davies 2004) [Diagram grouping evidence types and associated methods: program or intervention effectiveness (experimental, quasi-experimental, counterfactual, qualitative, theories of change); implementation evidence; attitudinal evidence (public consultation, surveys, qualitative); distributional data (surveys, admin data, comparative); economic / financial evidence (cost-benefit, cost-effectiveness, cost-utility, econometrics); forecast evidence (multivariate regression); ethics evidence; organizational evidence]

  33. A research impact continuum (from Sandra Nutley 2005): CONCEPTUAL USE → INSTRUMENTAL USE: awareness → knowledge / understanding → attitudes / perception → practice / policy change

  34. Accessibility to inform naïve use! • Access to papers on primary studies • Completeness of information in papers • Guidelines for authors: - CONSORT Statement for trials - Draft education guidelines

  35. Study of 489 published papers and survey of guidelines in 12 journals • incomplete reporting of primary empirical studies in education • little guidance for authors about the important information that they should report about their studies in some well-known educational journals • Draft new guidance: Newman M, Elbourne D, Leask M (forthcoming) Improving the usability of education research: guidelines for the reporting of primary empirical research studies in education (The REPOSE Guidelines). Evaluation and Research in Education

  36. Knowledge transfer • Research literacy of policy makers • Research summaries, e.g. CCKM, CHSRF ‘Myth Busters’ and ‘Evidence Boost’ • Worked examples • Intermediaries: champions, translators • Intermediary organisations, e.g. NICE, SCIE, NERF • Interactive working

  37. Models of transmission (from Sandra Nutley 2005) • Research-based practitioner • Embedded research • Organisational excellence

  38. Access to / transfer of what? • Quality assessment of recommended programmes • Vulnerability of individual studies • Distillation through systematic reviews • Quality assurance of reviews • Research evidence applicable locally? • Intermediaries to adapt but open to bias • Dangers of experts and panels

  39. Salience of the research • Timeliness • Anecdotes and clear messages • Lack of research data (no strategic focus to research activity) • Research exists but is not focused on policy makers' needs • Users of research and agenda setting

  40. Perspectives and participation • Perspectives and participation: (i) research, (ii) user of service / public, (iii) practitioner, (iv) policy community, (v) organisational • What do we want to know? Types of knowledge • Research questions: Effects? How / processes? Nature / extent / frequency? Perspectives / insights / concepts? • What has been done? Research studies and methods: action research, survey, case study, experimental, multi-method • What do we know? How do we know it? What don't we know? How could we know it? Research evidence • Interpretation and application • Communication

  41. Beyond knowledge transfer Linkage and exchange • Ongoing interaction, collaboration, and exchange of ideas between researcher and decision-maker communities Knowledge brokers and brokering • Link researchers and decision makers, facilitating their interaction so that they better understand each other's goals and professional cultures, influence each other's work, forge new partnerships and use research-based evidence

  42. Communities of practice • “An activity system about which participants share understandings concerning what they are doing and what it means in their lives and for their community” Lave & Wenger, 1991

  43. Limits to evidence use • “Decisions are less about projected consequences and more about process and legitimation. Politics is about shaping interpretations and expressing preferences. … Research… clarifi(es) issues and informs the wider public debate” Young et al (2002)

  44. Don’t forget the supply side • Still need investment in research and researchers! • In the UK, 2-3% of health budgets is spent on research. Same for education?

  45. Thank you • d.gough@ioe.ac.uk • http://eppi.ioe.ac.uk
