
Impact of a Summer Bridge Program on Math Skills


Presentation Transcript


  1. Impact of a Summer Bridge Program on Math Skills
  Randall Hickman, Deirdre Syms
  Macomb Community College
  2012 MIAIR Conference

  2. Introduction – National Issue
  • Summer Bridge Program Goals
    • Improve the college readiness of high school graduates in key subjects: math, reading, and English
    • Increase student retention (remedial reading level is a leading predictor of college dropout)
    • Increase student degree completion
    • Reduce the cost of remediating students

  3. Introduction – Macomb Research
  • Research on the effectiveness of accelerated developmental mathematics summer workshops in improving student outcomes
  • Outcomes measured as:
    • Change in Compass math placement test scores, before and after workshop participation
    • Change in students' level of perceived math self-efficacy

  4. Introduction – Macomb Research

  5. Introduction – Intervention
  • Target Population
    • Rising high school seniors placing into remedial math
  • Four Workshops
    • Fractions in Action
    • Decimals: What's the Point
    • Do You Have the Power? Working with Exponents, Square Roots and Scientific Notation
    • Factoring Polynomials with Ease

  6. Introduction – Intervention
  • Implementation
    • Compass pre-test and initial collection of survey data occurred in May
    • Workshops occurred in June
    • Compass post-test and final collection of survey data occurred in June, immediately upon completion of the workshops

  7. Intervention – Five Key Elements
  • Parental Involvement
    • Invitation to an orientation with informational materials
    • Orientation at the high school with the principal and a Macomb representative
    • Incentive of a 3–4 credit tuition voucher
  • High School Involvement
    • Principal and teacher selection of candidates based on test scores and ability to benefit
    • Math teacher participation during workshops and transportation
  • Support Service – free bus transportation

  8. Intervention – Five Key Elements
  • Accelerated Instruction
    • Two days, three hours per day, on each workshop topic
    • Students recommended for any combination of workshops based on Compass subject test results
    • Guided lecture by experienced college professors
  • Academic Support
    • Tutorial assistance from Macomb honors students
    • Supplementary materials for use outside of the workshops

  9. Methodology
  • Design – one group with pre- and post-test measures
  • Sample
    • Total N = 65 (16 for Compass, 8 for perceived self-efficacy change score analyses)
  • Variables
    • Pre- and post-test Compass math scores (Pre-Algebra, Algebra, College Algebra)
    • Pre- and post-test perceived self-efficacy
      • Four measures, one specific to each skill set: Fractions, Decimals, Powers and Square Roots, Polynomials

  10. Methodology
  • Variables (continued)
    • Change scores (post-test minus pre-test) for Compass and the four perceived self-efficacy measures (a minimal sketch follows this slide)
    • Overall GPA
    • Math GPA
    • High school math courses taken
    • Program participation
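
A minimal sketch of this change-score construction, assuming hypothetical column names in a pandas DataFrame; the deck does not specify the data layout, and the values below are illustrative:

```python
import pandas as pd

# Hypothetical student-level records; the column names and values are
# assumptions for illustration, not the study's actual data.
df = pd.DataFrame({
    "pre_compass":       [28, 17, 22, 31],
    "post_compass":      [35, 21, 20, 44],
    "pre_se_fractions":  [6.0, 7.5, 8.0, 5.5],
    "post_se_fractions": [7.0, 8.0, 8.5, 7.5],
})

# Change score = post-test minus pre-test, as defined on the slide.
df["compass_change"] = df["post_compass"] - df["pre_compass"]
df["se_fractions_change"] = df["post_se_fractions"] - df["pre_se_fractions"]
print(df[["compass_change", "se_fractions_change"]])
```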

  11. Methodology
  • Analysis
    • Given the small N for the pre- and post-test measures and change scores, and sample characteristics strongly indicative of non-normality, nonparametric bootstrapping was used to generate empirical distributions of key statistics
    • Bias-corrected and accelerated (BCa) confidence intervals were used
    • All bootstrapped statistics were derived from 1,000 bootstrap samples (a sketch of the procedure follows this slide)
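
A minimal sketch of this bootstrap procedure using scipy.stats.bootstrap, assuming hypothetical change scores; the slide does not say which software or which statistics the authors bootstrapped, so the mean is used here for illustration:

```python
import numpy as np
from scipy.stats import bootstrap

# Hypothetical Compass change scores (post minus pre); the study's
# actual data are not reproduced in the deck.
change = np.array([7, 4, -2, 13, 0, 5, 9, -1, 3, 6, 2, 8, -3, 11, 1, 4])

# Nonparametric bootstrap of the mean with a bias-corrected and
# accelerated (BCa) 95% confidence interval, 1000 resamples as on the slide.
res = bootstrap(
    (change,), np.mean,
    n_resamples=1000,
    confidence_level=0.95,
    method="BCa",
    random_state=np.random.default_rng(0),
)
print(res.confidence_interval)
```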

  12. Results
  • Students electing to participate in the workshops differed from those who did not:
    • Lower mean overall GPA (2.39 vs. 2.67, p ≤ .05)
      • High School A: 2.47 vs. 2.49, n.s.
      • High School B: 2.30 vs. 2.88, p ≤ .05
    • Significantly lower mean pre-test Algebra score at both high schools (20.8 vs. 34.7, p ≤ .01)
      • 95% CI for participants: 14.9 – 25.9
      • 95% CI for non-participants: 29.8 – 39.7
    • Lower mean perceived self-efficacy at pre-test (7.65 vs. 7.82, n.s.)

  13. Results
  • Students' perceived math self-efficacy "tracks" reality (i.e., the Compass score as an objective measure) at least to some extent
  • Compass pre-test Pre-Algebra score was associated with the mean pre-test perceived self-efficacy rating: r = .51, p ≤ .001 (a sketch of this check follows)
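
A minimal sketch of this kind of association check, assuming hypothetical paired scores; scipy.stats.pearsonr returns the correlation coefficient and its p-value:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical pairs: Compass Pre-Algebra pre-test scores and mean
# pre-test self-efficacy ratings (illustrative, not the study's data).
compass_pre   = np.array([18, 25, 30, 22, 35, 28, 40, 20])
self_efficacy = np.array([6.5, 7.0, 8.0, 6.0, 8.5, 7.5, 9.0, 6.8])

r, p = pearsonr(compass_pre, self_efficacy)
print(f"r = {r:.2f}, p = {p:.3f}")
```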

  14. Results – Compass Test Change
  • Placement into college-level math:
    • Pre-test: 19%
    • Post-test: 31%

  15. Results – Compass Test Effect Size
  • Of the 33 Compass change scores different from zero, 22 were positive (p ≤ .01, Fisher one-sample randomization test; a sketch of this test follows)
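
A minimal sketch of a Fisher one-sample (sign-flip) randomization test, assuming hypothetical change scores: under the null hypothesis the change scores are symmetric about zero, so each score's sign is equally likely to flip, and the observed mean is compared with the distribution of sign-flipped means:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nonzero change scores (illustrative, not the study's data).
change = np.array([5, 3, -2, 8, 6, -1, 4, 7, 2, -3, 9, 1])
observed = change.mean()

# Monte Carlo randomization: flip each score's sign at random many times
# and recompute the mean under the null of symmetry about zero.
n_perm = 10_000
signs = rng.choice([-1, 1], size=(n_perm, change.size))
perm_means = (signs * change).mean(axis=1)

# One-sided p-value: proportion of sign-flipped means at least as large
# as the observed mean.
p = (perm_means >= observed).mean()
print(f"observed mean = {observed:.2f}, p = {p:.4f}")
```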

  16. Results – Macomb and Texas Outcomes
  • Texas Summer Bridge Program Research Project
    • Compared pre- and post-program test scores
    • Conducted effect size analysis

  17. Results – Perceived Self-Efficacy

  18. Results – Perceived Self-Efficacy
  • Mean perceived self-efficacy change score
    • Mean: .88 (95% CI: .08 – 1.69)
    • Median: .63 (95% CI: -.25 – 1.88)
  • Evidence of a ceiling effect
    • Correlation between pre-test and change score: -.88 (p ≤ .01; see the note following this slide)
  • Mean perceived self-efficacy change score negatively (weakly) associated with Compass change scores (n.s.)
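
A strong negative pre-test/change correlation is what one would expect both from a ceiling effect and from the algebra of change scores; the following note is added context, not part of the original deck. Writing X for the pre-test, Y for the post-test, and D = Y - X for the change score:

```latex
\operatorname{Cov}(X, D) = \operatorname{Cov}(X, Y - X)
                         = \operatorname{Cov}(X, Y) - \operatorname{Var}(X)
```

so the correlation is negative whenever Cov(X, Y) < Var(X). A bounded rating scale makes this likely: students who start near the top of the scale have little room to gain, which depresses Cov(X, Y).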

  19. Discussion – Methodological Issues
  • Small N limited statistical power
  • Absence of a control group limited the meaningful options for analysis
  • Reasonable assumptions:
    • Students took the pre- and post-test measures seriously
    • No test-retest learning took place
    • The limited time between pre- and post-test measures ruled out maturation
    • Given the nature of the post-test measure, an external event is unlikely to be an explanation

  20. Discussion – Lessons Learned
  • Complications in Implementation
    • High Schools A and B appear to have used different criteria in candidate selection
      • High School A had a large proportion of low-motivation students
      • High School B had a large proportion of non-remedial students
      • Many students invited to participate by High School B were not in the target population
    • Low participation
      • Smaller N, and therefore a higher level of uncertainty concerning impact
      • Cost issues

  21. Discussion – Lessons Learned
  • Complications in Implementation (continued)
    • Lack of follow-through from the high schools in providing student data (HS GPA, math course GPA, math course list, STAR grade-equivalency data)
    • Lack of follow-through in administering the post-program survey
    • Led to unfinished research on student characteristics

  22. Discussion – Unresolved Issues
  • Planned research:
    • What accounts for the large variation in change scores?
    • What student characteristics were most strongly associated with gains in scores and gains in perceived self-efficacy?
    • Were some components (workshops) of the program more effective than others?

  23. Conclusion
  • Evidence is suggestive of a positive impact of the bridge program on math skills and on perceived self-efficacy with respect to math
  • Considerable variation in change scores suggests that student characteristics play an important role in mediating the impact of a bridge program
  • Well-planned and well-executed implementation is important for helping to ensure:
    • Adequate participation rates
    • Thorough evaluation of the program

  24. Randall Hickman (hickmanr@macomb.edu)
  Deirdre Syms (symsd@macomb.edu)
  Macomb Community College

  25. Additional Discussion – Methods
  • Regression to the mean is not an issue because the "assignment" process (the process by which students ended up in the bridge program) was unrelated to the pre- and post-test measures
  • Change scores may reflect limitations on test-retest reliability in addition to changes in math skill level
  • Equal change scores do not necessarily represent equal changes in ability

  26. Additional Discussion – Lessons Learned
  • Next Bridge Program Design Changes
    • Lead professor recommends:
      • Shorter workshops
      • Scheduling during the school year, not the summer
    • Need an incentive directed toward the student, not the parent
      • A tuition waiver is a poor incentive for high school juniors
    • Need student buy-in (it appears only the high schools and parents were on board)
