
Studying the use of research knowledge in public bureaucracies

This study examines the use of research evidence among policy analysts working in health and non-health ministries. It identifies significant correlates of research use and provides empirical evidence on the association between direct interactions with researchers and research use. The study has limitations, including self-reported data and its observational, cross-sectional design.




Presentation Transcript


  1. Studying the use of research knowledge in public bureaucracies Mathieu Ouimet, Ph.D. Department of Political Science Faculty of Social Sciences CHUQ Research Center KT National Seminar Series December 9, 2010 12:00–13:00 ET

  2. Learning objectives • To define research knowledge and use • To report the results of a cross-sectional study of the use of research evidence in health and non-health ministries • To invite researchers to use a variety of methodological approaches

  3. 1. Key definitions and clarifications

  4. Defining scientific knowledge • Seldom conceptually defined in RU studies • Unresolved demarcation problem • KKV* definition of scientific research • The goal is inference - causal or descriptive • The procedures are public - to allow assessment • The conclusions are uncertain – this should be made explicit • The content is the method - rather than the subject matter • Object of contention - Nomothetic vs Idiographic • * King G, Keohane RO, Verba S. (1994). Designing Social Inquiry. Princeton (New Jersey): Princeton University Press.

  5. Defining research utilization (RU) • Instrumental / conceptual / symbolic • RU standards (Knott & Wildavsky, 1980) – Outcomes to measure • Reception - when studies reach users • Cognition - when studies are read, digested and understood • Reference – when studies change users’ frame of reference • Effort – when users fight for the adoption of studies’ recommendations • Adoption – when studies influence policy adoption • Implementation – when studies influence policy implementation • Impact – when policy stimulated by studies yields tangible benefits (outcomes)

  6. 2. Cross-sectional study of policy analysts in health and non-health ministries

  7. STUDY AIM • OVERALL OBJECTIVE: to identify significant correlates of research use among policy analysts working at the ministerial level. • SPECIFIC OBJECTIVE: to provide empirical evidence on the magnitude of the association between direct interactions with researchers and research use, while adjusting for other correlates

  8. STUDY LIMITATIONS • Cross-sectional nature of the data • Self-reported data (social desirability bias, recall bias) • Study does not document what determines which research articles, reports or books the policy analysts read • Observational rather than experimental (the study misses the required step of demonstrating experimentally that changes in the correlates will have the desired effects and are not simply manifestations of some deeper cause)

  9. METHODS /1 • DESIGN: A random-digit-dialing telephone cross-sectional survey. • PARTICIPANTS: Policy analysts, defined as civil servants belonging to 14 professional groups. • SETTING: 17 ministries, including Health & Social Services • DATA COLLECTION: Questionnaire administered by a small survey firm between 26 September and 25 November 2008 using computer-assisted telephone interviewing (CATI), which allows for simultaneous data entry and data coding. • N = 1614 (response rate = 62.48%)

  10. METHODS /2 • SURVEY QUESTIONS: Only closed-ended questions • MAIN OUTCOMES: • Consultation of scientific articles • Consultation of academic research reports • Consultation of academic books (chapters)

  11. METHODS /3 • 3 ordinal regression models (3 outcomes) • Modifiable correlates (e.g. direct interactions with researchers, perceived relevance of academic research, etc.) • Unmodifiable correlates (e.g. training type, disciplinary field of training, gender, age, policy sectors, policy stages, etc.) • Post-estimation simulations

  12. Percentage distribution for the correlates considered in the study /1

  13. Percentage distribution for some correlates considered in the study /2

  14. Percentage distribution for the types of documents consulted monthly or weekly (n= 1614)

  15. Consultation of different types of documents – Health and Social Services (n= 100) (Monthly & Weekly consultation combined)

  16. Consultation of scientific articles across policy sectors (monthly & weekly consultation combined)

  17. Percentage distribution for the three outcome variables across policy sectors (monthly and weekly consultation combined)

  18. Correlates positively and significantly associated with the three outcome variables • Only two correlates were not significantly associated with any outcome variable: gender, and being solely involved in policy evaluation rather than in policy formulation

  19. Statistical simulations /1 • Each unmanipulable correlate was fixed to a specific value using descriptive statistics as a guideline. • Fixed unmanipulable correlates: • disciplinary field of training (fixed at = human and social sciences); • English reading (fixed at = yes); • type of studies preferred (fixed at = quantitative studies); • age (fixed at = 40–49 years); • gender (fixed at = men); • production of written advice (fixed at = yes); • proportion of working time spent in meetings (fixed at = second quartile, 6.68%–14.28%); • policy stages (fixed at = two policy stages).

  20. Statistical simulations /2 • Four manipulable correlates shifted simultaneously from their minimum (0) to their maximum value (1): • interactions with researchers in human and social sciences; • continuing professional development involving scientific content; • reported access to electronic bibliographic databases from own workstation; • perceived relevance of academic research evidence. • Combined marginal effect computed 16 times, for each policy sector (then the average was calculated). • Same procedure repeated by changing the training type value from ‘undergraduate’ to ‘research Master’s/PhD’ • Also reported: lowest and highest combined marginal effects observed in specific policy sectors.
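The simulation procedure described on slides 19–21 can be sketched as follows. Under a proportional-odds (ordinal logit) model, the combined marginal effect is the difference in the predicted probability of the top response category (weekly consultation) between a profile with all four manipulable correlates at their minimum (0) and a profile with all at their maximum (1), with unmanipulable correlates held fixed. The coefficients and cut-points below are illustrative placeholders, not the study's estimates:

```python
import math

def logistic(z):
    """Standard logistic function."""
    return 1.0 / (1.0 + math.exp(-z))

def p_top_category(x, beta, cutpoints):
    """P(highest ordered category) under a proportional-odds logit model.

    P(Y = top) = 1 - P(Y <= second-highest) = 1 - logistic(tau_last - x.beta)
    """
    eta = sum(b * xi for b, xi in zip(beta, x))
    return 1.0 - logistic(cutpoints[-1] - eta)

# Illustrative (made-up) coefficients for the four manipulable correlates:
# interactions with researchers, CPD with scientific content,
# database access from workstation, perceived relevance of research.
beta = [0.6, 0.4, 0.5, 0.7]
# Illustrative thresholds separating the ordered response categories.
cutpoints = [-0.5, 0.8, 2.0]

# Shift all four manipulable correlates from minimum (0) to maximum (1).
baseline = p_top_category([0, 0, 0, 0], beta, cutpoints)
shifted = p_top_category([1, 1, 1, 1], beta, cutpoints)

# Combined marginal effect: change in probability of weekly consultation.
marginal_effect = shifted - baseline
print(f"Combined marginal effect: {marginal_effect * 100:.1f} percentage points")
```

In the study, this difference was computed for each of the 16 policy sectors (with the fixed correlates set per slide 19) and then averaged; the sketch above shows the per-profile calculation only.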

  21. Percentage points increase or decrease in the probability of weekly consultation of scientific articles* *Simulated marginal effect of a simultaneous change in modifiable correlates on weekly consultation of scientific articles

  22. 3. Some challenges for future research in this field

  23. Research challenges • Measuring research use objectively in public bureaucracies • Measuring research use according to other standards (e.g. benefits, health outcomes, etc.) • Documenting the research knowledge infrastructure found in ministries and studying its effect on policy analysts’ utilization behaviour • Opening up the black box of research knowledge • Opening up the black box of direct interactions • Conducting experimental research in ministries and agencies

  24. Thank you
