
Initial Evaluation Design

This study examines the effects of CDDRE participation on schools. An initial delayed treatment design proved infeasible, so treated schools were instead compared with untreated schools matched on prior achievement, demographics, and urban/rural location. The research aims to determine the impact of CDDRE participation and the effectiveness of proven models.


Presentation Transcript


  1. Initial Evaluation Design • Random assignment of 59 districts in 3 cohorts starting in 2005, 2006, and 2007 • Delayed treatment design • However, uneven paces of implementation made this design inappropriate

  2. Comparisons with True Controls • Each school matched with an untreated school in the same state on: prior achievement, demographics, and urban/rural location • Schools followed for up to 4 years • Data combined across states (z-scores)
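The "data combined across states (z-scores)" step above can be sketched as standardizing each state's scores against its own mean and standard deviation before pooling, so that different state tests are on a common scale. This is a minimal illustration, not the study's actual analysis code; the function name and the toy score arrays are invented for the example.

```python
import statistics

def pool_as_z_scores(scores_by_state):
    """Standardize each state's raw scores within state (z-scores),
    then pool them into one list. A sketch of combining data across
    states that use different achievement tests."""
    pooled = []
    for state, scores in scores_by_state.items():
        mean = statistics.mean(scores)
        sd = statistics.stdev(scores)
        # z-score: distance from the state mean in state SD units
        pooled.extend((s - mean) / sd for s in scores)
    return pooled
```

After this transformation each state contributes scores centered at 0 with unit variance, so schools in, say, Pennsylvania and Tennessee can be analyzed together despite sitting on different raw test scales.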

  3. Participating States • Pennsylvania (28) • Mississippi (4) • Ohio (1) • Indiana (2) • Arizona (4) • Tennessee (10) • Alabama (2) • 272 elementary and 152 middle schools

  4. Percentage of Schools Choosing a Reading Intervention • Grade 5: 2006 cohort (3 years) 30%; 2005 cohort (4 years) 42% • Grade 8: 2006 cohort (3 years) 28%; 2005 cohort (4 years) 33%

  5. Research Questions In comparison to non-CDDRE schools, what were the effects of CDDRE participation? How did effects vary for schools that did or did not choose proven models?

  6.–13. [Figures 1–8: results figures; images not reproduced in the transcript]

  14. Conclusions • A focus on data alone does not add to outcomes • Implementation of benchmark assessments adds only moderately to outcomes • What significantly adds to outcomes is adoption of proven school-level programs

  15. Implications for Policy: Focus on helping schools adopt proven programs • Review research on programs (as in BEE, WWC) • Provide incentives and assistance to use proven programs • Invest in creation of new programs (as in i3) • Invest in scale-up of proven programs (as in i3)
