
“Synthesizing the evidence on the relationship between education, health and social capital”


Presentation Transcript


  1. “Synthesizing the evidence on the relationship between education, health and social capital”
  Dan Sherman, PhD
  American Institutes for Research
  25 February 2010, Oslo, Norway

  2. How to Synthesize? And for Whom?
  • Obviously an enormous number of studies in most fields, even for narrow questions
  • What should the approach be to reviewing studies, summarizing outcomes, and then evaluating the “quality” of studies?
  • How should syntheses be prepared and distributed?
  • What are the problems with research syntheses?

  3. Challenges to Synthesis
  • Studies can use different designs, different populations, and different sample sizes
  • Studies of “effect” typically compare one group to another
    • e.g., those with more education
    • e.g., those enrolled in an early childhood program
  • Many validity problems in studies: selection effects, consistency/fidelity of adoption, attrition
  • Need to decide how to rank and present evidence
  • For policy, syntheses must be accessible and useful

  4. US What Works Clearinghouse (WWC)
  Starting in 2002, the US Dept of Education set up the WWC to:
  • “Produce user-friendly practice guides for educators that address instructional challenges with research-based recommendations for schools and classrooms;
  • Assess the rigor of research evidence on the effectiveness of interventions (programs, products, practices, and policies), giving educators the tools to make informed decisions;
  • Develop and implement standards for reviewing and synthesizing education research”
  Source: http://ies.ed.gov/ncee/wwc/

  5. Approach of WWC
  • Attempts to provide systematic evidence reviews of educational practices; has worked in 8 separate topic areas (e.g., early reading, math instruction)
  • Applies a very high standard, favoring Randomized Controlled Trials (RCTs) over Quasi-Experimental Designs (QEDs)
  • The project produces quality-of-evidence reports and “practice guides” based on good evidence

  6. WWC Approach Favors RCTs

  7. Sample WWC Output for 25 Early (K-3) Reading Programs

  8. Few Studies Meet WWC Standards for Inclusion
  • Large numbers of studies reviewed
  • Relatively few meet evidence standards
    • Prefers RCTs
    • Good matching
    • Low attrition
  • WWC reduces the pool further by requiring substantive (vs. statistical) significance

  9. Outcome of WWC
  • A very small number of studies meet the high evidence standards
  • Really is the “Almost Nothing Works” Clearinghouse
  • The WWC Procedures and Standards Handbook is a very useful reference on evidence review; it describes and discusses the standards
  • Produces practice guides and “Doing What Works” multimedia web presentations aimed at educators

  10. Analogy with Health Research
  • Example: Agency for Healthcare Research and Quality (AHRQ)
  • Nearly 200 completed “evidence-based” practice reports, mostly on medical treatment
  • These summarize the literature and present estimated effects using meta-analytic techniques
  • Readily searchable on the AHRQ website
  • Present strength of evidence based on design, sample sizes, etc.
  • The medical model makes it easier to run small RCTs with controls (e.g., placebo drugs)
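
Slide 10 mentions presenting estimated effects with meta-analytic techniques. As a minimal sketch of what that pooling step involves, the code below implements standard fixed-effect (inverse-variance) pooling; the three study effect sizes and standard errors are hypothetical, not drawn from any AHRQ report:

```python
import math

def pooled_effect(effects, std_errs):
    """Fixed-effect (inverse-variance) meta-analytic pooling.

    Each study is weighted by 1 / SE^2, so more precise studies
    (smaller standard errors) contribute more to the pooled estimate.
    Returns the pooled effect and its standard error.
    """
    weights = [1.0 / se ** 2 for se in std_errs]
    total_weight = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, effects)) / total_weight
    pooled_se = math.sqrt(1.0 / total_weight)
    return pooled, pooled_se

# Three hypothetical studies: effect size estimates with standard errors
effects = [0.30, 0.10, 0.25]
std_errs = [0.10, 0.05, 0.15]

est, se = pooled_effect(effects, std_errs)
print(f"pooled effect = {est:.3f}, 95% CI = [{est - 1.96*se:.3f}, {est + 1.96*se:.3f}]")
```

Note how the most precise study (SE = 0.05) dominates the pooled estimate, pulling it toward 0.10; this is exactly why strength-of-evidence grading also weighs design and sample size, not just the point estimates.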

  11. AHRQ: Strength of Evidence Grades
  • High: High confidence that the evidence reflects the true effect. Further research is very unlikely to change our confidence in the estimate of effect.
  • Moderate: Moderate confidence that the evidence reflects the true effect. Further research may change our confidence in the estimate of effect and may change the estimate.
  • Low: Low confidence that the evidence reflects the true effect. Further research is likely to change our confidence in the estimate of effect and is likely to change the estimate.
  • Insufficient: Evidence either is unavailable or does not permit estimation of an effect.

  12. Comparison of AHRQ and WWC
  • AHRQ is able to draw on a larger literature
  • Easier design issues in medical studies (less concern about selection/confounding factors)
  • Much higher “pass rate” for health studies
  • Practice guides may be more “current,” since the base literature is updated more quickly
  • More “acceptable” studies are being produced in health; they are standard in the drug-treatment literature
  • Health studies can generally focus on a specific “treatment,” whereas education has many confounding components (which favors RCTs)

  13. General Challenges in Research
  • Most research looks at outcomes/effects in the short term
    • Does end-of-year student achievement improve for fifth graders if teachers use a specific reading method?
    • Does a drug reduce blood pressure? By how much?
  • Key question: does a long-term, sustainable effect exist?
  • Do measured effects apply to the larger population?
  • If data were obtained for a reason other than the evaluation study, can we use them for evaluation?
  • What methods should be applied? What are the comparison groups? Can we match individuals?

  14. Questions for Thought/Discussion
  • What is good evidence of effect to guide policy? How high a standard is needed?
  • Must we have (expensive) RCTs for everything?
  • What data should routinely be gathered to support later studies, whatever the method? Important policy question: what data to collect?
  • How best to synthesize and report evidence so it is useful to policymakers AND practitioners?
  • Doing What Works, not just Knowing What Works (or thinking we know)

  15. Working to Improve Health in the US
  Boston Marathon, 2009: 40 km done, 2 to go!
