
Lessons Learned from the National Evaluation of Upward Bound and New Directions in light of 2008 HEOA

SFARN Conference, Margaret Cahalan, US Department of Education, June 2009




  1. Lessons Learned from the National Evaluation of Upward Bound and New Directions in light of 2008 HEOA. SFARN Conference, Margaret Cahalan, US Department of Education, June 2009. All tabulations and views reported in this presentation are the sole responsibility of the author and do not reflect any review, authorization, or clearance by the US Department of Education.

  2. Topics • History and background: PART, TRIO, HEOA context • A transparent, critical look at issues and re-analyses • Lessons learned and new models for evaluation, with an example from GEAR UP plans

  3. Clarification of What This Presentation Is Not Intended to Be • A critique of random assignment: I recognize the power of the method and hope to improve its application • A critique of Mathematica: I disagree with some procedures and conclusions they reached, but greatly respect the hard work and determination of completing this study • An act of advocacy for the program: I am acting as a researcher concerned with meeting research standards

  4. Upward Bound: 2008 Overview • Year begun: 1965 • Students targeted: high school students • Total funding: $328,528,379 • Number of grants: 1,017 • Number served: 71,247 • Average award: $323,036 • Amount per person served: $4,611 • Avg. number funded to serve per project: 70

  5. Study History • Controversial from the beginning • Combined random assignment with a probability sample weighted to produce nationally representative impact estimates • Second national evaluation (the first was conducted by RTI in the 1970s) • First random assignment study of UB, conducted 1992-2004 • Impact reports published in 1996, 1999, and 2004; the fourth follow-up report was unpublished • Fifth follow-up report published in January 2009

  6. Policy History • Third follow-up reported few overall average effects, but large effects for students at risk academically and for those with lower educational expectations • Newly devised OMB PART rating of “ineffective”; UB initiative to serve at-risk students in the 2002 budget • Zero funding proposed for federal pre-college programs in FY05 and FY06; dropped in FY07 and FY08

  7. Policy History (cont) • UB 2006 absolute priority to serve 1/3 at-risk and 9th-grade students • New random assignment study to evaluate it begun in 2006 • Blocked by Congress in 2007 and cancelled by ED in 2008 • HEOA 2008: mandates rigorous evaluations in TRIO; prohibits over-recruitment for random-assignment denial of services; absolute priority cancelled

  8. Personal Involvement History • PD for SSS and TS evaluations; survey director for the third follow-up; UB performance report analyses • Joined ED in late 2004; the study was under my team, which was working on the fourth follow-up • Concern arose when the lead analyst for the fourth follow-up sent tables showing results were sensitive to a single project • That project had 26 percent of the weight and seemingly large negative impacts • Review of fourth-report concerns: from 2006 to 2008 I began to look at the data and consult with others concerning the issues

  9. Basic Assumptions of Random Assignment Studies • Sample is representative of the population to which one wishes to generalize • Treatment and control groups are equivalent • Treatment and control groups are treated equally except for the treatment • Treatment and control groups are mutually exclusive with regard to the treatment
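The equivalence assumption can be checked directly from baseline data. A minimal sketch: a two-sample z-test on a baseline proportion. The 56 vs. 15 percent figures echo the advanced-degree expectations reported later for project 69; the group sizes of 200 are hypothetical, chosen only for illustration.

```python
import math

def balance_z(p_t, n_t, p_c, n_c):
    """Two-sample z-statistic for a baseline proportion (treatment vs. control)."""
    p_pool = (p_t * n_t + p_c * n_c) / (n_t + n_c)   # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_t + 1 / n_c))
    return (p_t - p_c) / se

# Hypothetical sizes (n=200 per arm): 15% of treatment vs. 56% of controls
# expect an advanced degree at baseline
z = balance_z(0.15, 200, 0.56, 200)
print(round(z, 2))  # a large |z| flags baseline non-equivalence
```

A well-randomized sample should give z-statistics near zero across baseline covariates; values this extreme indicate the kind of imbalance the presentation attributes to a single outlier project.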

  10. Examined 5 Issues • Sample Design • Treatment Control Group Equivalency • Study attrition and non-response bias • Standardization of outcome measures • Control group service substitution and treatment drop-out

  11. 1. Sample Design Issues • Sample highly stratified: 46 strata for 67 projects • Unequal weighting: one project has 26 percent of the weight, 3 projects have 35 percent, and 8 projects have 50 percent • Project-level stratification: 339 strata, unequal within projects, an average of 8 members per stratum, target sample of 3,000 • Basic design flaw: a one-project stratum for the largest strata makes it impossible to estimate sampling variance • Serious representational issues with project 69, a former 2-year institution representing 4-year institutions • Treatment-control non-equivalency introduced by the outlier 26-percent project
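The cost of this weight concentration can be quantified with Kish's design effect for unequal weighting, 1 + CV² of the weights. The weight vector below is illustrative, constructed only so that the largest unit carries 26 percent of the total weight, echoing project 69; it is not the study's actual weight distribution.

```python
def kish_deff(weights):
    """Kish's design effect from unequal weighting: 1 + CV^2 of the weights."""
    n = len(weights)
    mean = sum(weights) / n
    var = sum((w - mean) ** 2 for w in weights) / n
    return 1 + var / mean ** 2

def weight_share(weights, k):
    """Share of total weight held by the k largest units."""
    return sum(sorted(weights, reverse=True)[:k]) / sum(weights)

# Illustrative weights: one dominant unit (26% of total), 8 mid-weight, 50 small
weights = [26] + [3] * 8 + [1] * 50
print(weight_share(weights, 1))        # 0.26
print(round(kish_deff(weights), 2))    # 4.71
```

A design effect near 1 means the unequal weights cost little precision; a value like the one above means the effective sample size is only a fraction of the nominal one, which is why a single 26-percent project can dominate the weighted estimates.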

  12. 2. Treatment-Control Non-Equivalency • Sample well matched without project 69 • Project 69 introduces bias into the overall sample in favor of the controls • Project 69 has large differences, for example: • Educational expectations: 56 percent of controls expect an advanced degree vs. 15 percent of treatment • 9th-grade academics: 8 percent of controls at risk vs. 33 percent of treatment • Expected HS graduation in 1997: 60 percent of treatment vs. 42 percent of controls

  13. Expectation of Advanced Degree

  14. Imbalance in Project 69

  15. Balance in 66 other Projects

  16. Bias in Overall Sample with 69

  17. Bias Introduced by Project 69

  18. 3. Lack of Outcome Standardization to Expected High School Graduation Year (EHSGY) • Multi-grade study cohort (grades 7 to 10 in 1992-93) • Randomization took place over 18 months • Small imbalances between treatment and control: the control group has a larger percentage of older 10th-grade students • Constructed standardized measures based on EHSGY
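The EHSGY construction can be sketched in a few lines. The mapping below assumes normal grade progression from the grade reported in spring 1993 (e.g. a 9th grader in 1992-93 is expected to graduate in 1996); outcomes are then expressed relative to that year rather than to calendar time, so the multi-grade cohort is put on a common footing.

```python
def expected_hs_grad_year(grade_in_1993):
    """Expected HS graduation year (EHSGY) from grade in spring 1993,
    assuming normal grade progression."""
    return 1993 + (12 - grade_in_1993)

def years_from_ehsgy(event_year, grade_in_1993):
    """Standardize an outcome year relative to EHSGY (e.g. 4 means '+4 years')."""
    return event_year - expected_hs_grad_year(grade_in_1993)

print(expected_hs_grad_year(9))    # 1996
print(years_from_ehsgy(2000, 9))   # 4, i.e. enrollment observed at +4 years
```

With this standardization, a 7th grader and a 10th grader randomized in the same window are compared at the same point in their own trajectories, removing the artifact created by the control group's surplus of older 10th graders.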

  19. 4. Survey Attrition and Non-Response Bias • A standing concern in longitudinal studies • UB response rates very high for follow-ups, but at 74 percent by the end; the control group's response rate was 4-5 percent lower • Students with positive outcomes were more likely to respond • Reports through the fourth did not use administrative records; used federal aid files to observe and impute
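One standard remedy for this kind of attrition, sketched below under simplifying assumptions, is to inflate responders' base weights by the inverse of the weighted response rate within adjustment cells, so the responding sample again represents the full cohort. The cells and weights in the example are hypothetical, not the study's actual adjustment classes.

```python
def nonresponse_adjust(base_weight, responded, cells):
    """Inflate responders' base weights by the inverse weighted response rate
    within each cell; non-responders get weight 0. Assumes every cell has at
    least one responder. Inputs are parallel lists."""
    totals, resp = {}, {}
    for w, r, c in zip(base_weight, responded, cells):
        totals[c] = totals.get(c, 0.0) + w
        if r:
            resp[c] = resp.get(c, 0.0) + w
    return [w * totals[c] / resp[c] if r else 0.0
            for w, r, c in zip(base_weight, responded, cells)]

# Hypothetical: cell "b" has a 50% response rate, so its responder is doubled
adj = nonresponse_adjust([1, 1, 1, 1],
                         [True, True, True, False],
                         ["a", "a", "b", "b"])
print(adj)  # [1.0, 1.0, 2.0, 0.0]; total weight is preserved
```

The adjustment preserves the total weight, but it only removes bias to the extent that responders and non-responders are alike within cells, which is exactly why the presentation turns to federal aid files to observe outcomes for non-responders directly.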

  20. Differences Between Responders and Non-Responders in percent found on aid files 1994-2004

  21. 5. Service Participation and Non-Participation Issues • Waiting-list drop-outs: 26 percent of the treatment group were coded on the waiting-list file as not accepting the opportunity, but were kept in the treatment sample • First follow-up survey: 18 percent of the treatment group participated in neither UB nor UBMS • Survey data: 12-14 percent of controls show evidence of UB or UBMS participation • 60 percent of controls and 92 percent of the treatment group reported some pre-college supplemental service participation

  22. Dealing with Issues: Approach of Re-Analyses • Sample design issues: present data weighted and unweighted • Bias in favor of the control group: present with and without project 69; consider the estimate for the 74 percent of the sample not represented by project 69 more robust • Standardize outcome measures by expected high school graduation year • Positive survey-response bias: used federal aid file records and NSC data for some analyses • Substitution and drop-out: additional observational analyses

  23. Alternative Analyses • Experimental analyses: • Logistic regression, intent to treat (ITT): the UB opportunity as originally randomly assigned • Instrumental variables regression, treatment on treated (TOT)/complier average causal effect (CACE): UB/UBMS participation • Quasi-experimental (observational) instrumental variables regression: • UB/UBMS participation compared to non-UB/non-UBMS service • Any service compared to no service • Selected subgroups (academic risk and educational expectations)
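The relation between the ITT and TOT/CACE estimands can be illustrated with the textbook Wald estimator: divide the ITT effect by the difference in participation rates between arms. This is only a sketch of the idea, not the study's actual instrumental-variables model. The participation rates below follow the figures on slide 21 (26 percent treatment no-shows implies roughly 0.74 take-up; 12-14 percent control crossover is taken as 0.13), and the 3.7-point numerator matches the ITT difference in BA award reported later (13.3 vs. 17.0).

```python
def cace(itt_effect, p_participate_treat, p_participate_control):
    """Wald estimator: TOT/CACE = ITT effect divided by the difference
    in participation rates between treatment and control arms."""
    return itt_effect / (p_participate_treat - p_participate_control)

# ITT effect of 3.7 percentage points (0.037), 74% take-up, 13% crossover
print(round(cace(0.037, 0.74, 0.13), 3))  # 0.061
```

Because take-up is well below 100 percent and some controls cross over, the TOT effect is necessarily larger than the ITT effect; the estimator applies only to compliers and assumes the offer affects outcomes solely through participation.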

  24. Baseline Variables in Model • Used: sex, race/ethnicity, low-income and first-generation status, educational expectations, grade in 1992-93, grade on the student selection form, and whether the participant reported receiving pre-college services prior to random assignment • Did not use academic indicators from 9th grade

  25. Aspirations at Baseline • 97 percent some postsecondary • 72 percent BA or higher • 25 percent below-BA postsecondary • 3 percent no postsecondary

  26. Descriptive Results • Enrollment: • 68 percent within +18 months of EHSGY • 70 percent within +4 years of EHSGY • 75 percent by the end of the study period, 7-9 years out • Postsecondary, any degree or certificate: 35-47 percent • 47 percent earned any postsecondary degree or certificate by 7 to 9 years, using only survey responders adjusted for non-response • 35 percent using the complete sample with NSC data • BA degree: just over 20 percent by the end of the study period (+8 years)

  27. Impact Results • Significant and substantial positive ITT and TOT findings, weighted and unweighted, and with and without project 69, for: • Evidence of postsecondary entrance within +18 months and within +4 years • Application for financial aid within +18 months and within +4 years • Evidence of award of any postsecondary degree or credential by the end of the study period (driven by project 69's less-than-4-year programs)

  28. Evidence of Enrollment from any survey or federal aid files

  29. Applicant on Aid File

  30. Awarded a BA within +6 Years of EHSGY • Weighted with project 69: not significant; unweighted: significant • For the 74 percent of the sample not represented by project 69: • 28 percent increase in BA award for the ITT UB-opportunity analysis (13.3 increased to 17.0) • 50 percent increase in BA award for the TOT UB-participation analysis (14.1 increased to 21.1)
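The percent increases quoted above follow directly from the point estimates; a quick arithmetic check:

```python
def pct_increase(before, after):
    """Relative increase, in percent, from 'before' to 'after'."""
    return 100 * (after - before) / before

print(round(pct_increase(13.3, 17.0)))  # 28 (ITT: 13.3 -> 17.0)
print(round(pct_increase(14.1, 21.1)))  # 50 (TOT: 14.1 -> 21.1)
```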

  31. Sub-Group Analyses • Bottom 20 percent on academic indicators: large positive significant effects for: • Postsecondary entrance • Application for financial aid • Award of any postsecondary degree • Not for the BA degree: too few achieved it to compare treatment and control • Top 80 percent on academic indicators: moderate positive significant effects for: • Postsecondary entrance • Application for financial aid • Award of any postsecondary degree • BA degree within +6 years

  32. UB/UBMS Participation Compared with Other Non-UB/UBMS Service Participation • Quasi-experimental: uses 2-stage instrumental variables regression, which controls for selection bias but does not eliminate it • Found statistically significant and substantive positive results for UB/UBMS participation for: • Evidence of postsecondary entrance at +1 and +4 years • Application for financial aid at +1 and +4 years • Award of a BA within +6 years, unweighted overall and both unweighted and weighted without project 69

  33. Lessons • Need to pay attention to sampling and non-sampling error issues even in random assignment studies: evaluation of evaluations • More listening to those being evaluated concerning what will and will not work: engagement of those evaluated • Transparency: multiple groups analyzing the data • ED should use a partnership model with projects for future UB studies • Gap between expectations and attainment: 72 percent expect a BA degree; just over 20 percent attain one within +8 years • HEOA 2008 contains a limitation on random assignment studies that require recruiting more students than would normally be recruited, for evaluation study purposes • How best to target resources and students to achieve goals and program improvement • How to focus the program; how to serve at-risk students?

  34. Next Steps: Trying with GEAR UP • Partnership model among stakeholders • Coordinated efforts across grantees • Utilize resources and leverage grantees' academic institutional research offices • Focus on program improvement rather than an up-or-down judgment • Open and transparent sharing • Build capacity for self-evaluation and accountability • Utilization of standards for statistical research and program evaluation

  35. GEAR UP Next Generation • Concept of using data as feedback for program improvement • About to issue a call for grantee partnerships to submit statements of interest in planning awards for rigorous studies (June 2009) • Small planning/proposal awards September 2009 • Implementation awards 2010-2012

  36. Contact Information • Margaret.Cahalan@ed.gov • 202-401-1679
