
Evaluation of Online Student Success: Actionable Research Outcomes

This study examines student outcomes in online education compared to face-to-face instruction using propensity score matching. The findings highlight the importance of continuous quality improvement and implementing best practices in online courses. The results inform UCI's efforts to improve online offerings and enhance student experiences.


Presentation Transcript


  1. Evaluation of Online Student Success: Actionable Research Outcomes Preston Reed, Principal Research Analyst, Office of Institutional Research, preston.reed@uci.edu

  2. Online Education

  3. National Online Enrollments (IPEDS)

  4. UC Online Enrollments (IPEDS)

  5. UCI Strategic Plan • Online education found a home in our strategic plan

  6. Online vs. Face-to-Face

  7. Online vs. Face-to-Face: (Dis)parity • Mixed findings • Large Dept. of Ed meta-analysis (Means et al., 2010) • All experimental or quasi-experimental studies • Found outcomes were *better* online than face-to-face (F2F) • "Online penalty" found in other studies • Large observational community-college studies have found performance gaps (e.g., Xu & Jaggars, 2014)

  8. Online Ed: Best Practices So what are some of the “best practices”? • Organization and Presentation • Learning objectives and assessments presented clearly • Interpersonal interaction • Appropriate use of technology

  9. What does that mean for UCI?

  10. Continuous Quality Improvement (CQI) • Integrating quality improvement into the daily operations of the system (Park et al., 2013) • CQI is characterized by three features • Frequency • Depth • Context within the system

  11. CQI at UCI

  12. The present study • Goal: examine student success outcomes in online courses relative to F2F

  13. Key outcome • Student success • Successful: • Earned a grade of “Pass” or “C” or better • Unsuccessful: • Earned a grade of “No Pass,” “C-” or lower • Dropped after census • Withdrew from the course
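The success rule above can be sketched as a small classification function. This is a minimal illustration: the grade labels, the drop/withdraw codes, and the function name are assumptions for the example, not UCI's actual coding scheme.

```python
# Hypothetical sketch of the success/failure rule from the slide.
# Assumed labels: letter grades, "Pass"/"No Pass", "Drop" (after
# census), and "W" (withdrawal) -- the real data coding may differ.
SUCCESS_GRADES = {"Pass", "A+", "A", "A-", "B+", "B", "B-", "C+", "C"}

def is_successful(grade: str) -> bool:
    """Return True if the recorded outcome counts as 'success'.

    Anything else -- "No Pass", "C-" or lower, a post-census drop,
    or a withdrawal -- counts as unsuccessful.
    """
    return grade in SUCCESS_GRADES

print(is_successful("C"))    # True: "C" or better succeeds
print(is_successful("C-"))   # False: below the cutoff
print(is_successful("W"))    # False: withdrawal
```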

  14. Selection Criteria • Data collected from 14 lower-division courses over 14 terms

  15. Analyses • Propensity Score Matching (PSM) used to match enrollments on several key demographic and academic characteristics

  16. In Brief: What is PSM? • Uses a set of predictor variables to estimate the likelihood of being in the treatment or control group • Matches individuals across groups on that predicted probability • This helps account for confounding variables • Findings can be easier to communicate
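The matching step described above can be sketched as greedy nearest-neighbor matching on a propensity score. Everything here is an illustrative assumption (the function name, the toy scores, the 0.05 caliper); a real analysis would first fit a logistic model of treatment assignment on the demographic and academic covariates to produce the scores.

```python
# Minimal sketch of the PSM matching step, assuming each enrollment
# already has a propensity score (probability of being in the online
# group) from a previously fitted model.

def greedy_nn_match(treated, control, caliper=0.05):
    """Greedily pair each treated unit with the nearest unmatched
    control unit by propensity score, within a caliper.

    treated, control: lists of (unit_id, propensity) tuples.
    Returns a list of (treated_id, control_id) pairs; treated units
    with no control inside the caliper are left unmatched.
    """
    available = dict(control)  # id -> score, controls still unmatched
    pairs = []
    # Matching the highest-propensity (hardest-to-match) units first
    # is a common heuristic in greedy matching.
    for t_id, t_score in sorted(treated, key=lambda x: -x[1]):
        best_id, best_gap = None, caliper
        for c_id, c_score in available.items():
            gap = abs(t_score - c_score)
            if gap <= best_gap:
                best_id, best_gap = c_id, gap
        if best_id is not None:
            pairs.append((t_id, best_id))
            del available[best_id]  # 1:1 matching without replacement
    return pairs

# Toy data: three online and three face-to-face enrollments.
online = [("o1", 0.62), ("o2", 0.35), ("o3", 0.90)]
f2f    = [("f1", 0.60), ("f2", 0.33), ("f3", 0.50)]
print(greedy_nn_match(online, f2f))  # o3 has no control within 0.05
```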

  17. Matching Characteristics • Grouped into • Demographics • E.g., age, gender, low-income • Academics • E.g., SAT Scores, GPA, Enrolled units

  18. Analyses • Resulted in 8,374 matched enrollments • 58 course offerings • 4,187 face-to-face • 4,187 online • Post-matching, no significant differences between groups on any matched variable

  19. Results • Across all courses: • Students in online courses were less likely to succeed than those in face-to-face courses • 84.4% vs. 90.8%, p < .001 • Small effect size (r² = .01)
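As a sanity check on the headline comparison, a two-proportion z-test on the reported rates and matched group sizes (4,187 per group) reproduces the p < .001 finding. The helper below is a sketch for illustration, not the study's actual analysis code; counts are back-calculated by rounding the reported percentages.

```python
# Two-proportion z-test sketch for 84.4% vs. 90.8% success with
# n = 4,187 in each matched group (from the slides above).
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """Return the z statistic and two-sided p-value for comparing
    two independent proportions, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))     # pooled std. error
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Successes back-calculated from the reported percentages (assumed).
z, p = two_prop_z(round(0.844 * 4187), 4187, round(0.908 * 4187), 4187)
print(f"z = {z:.2f}, p = {p:.2e}")  # a large negative z, p far below .001
```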

  20. Results by Course • Success rates for online vs. face-to-face delivery varied across courses.

  21. Results by Course

  22. Courses with significant success discrepancies

  23. Bringing Findings to Key Stakeholders • Findings aligned with what administrators already suspected • Four courses with high discrepancy • Not using current best practices • Mostly passive online reading assignments.

  24. Limitations • Findings based on a specific subset of enrollments and online course offerings • e.g., offered online and face-to-face in the same term, more than 100 enrolled in each medium; students who entered UCI as freshmen • Need to go beyond "Success" or "Failure" as a measure of instruction-method efficacy • For example, how students perform in the next course in a series (e.g., Bio 93 to Bio 94)

  25. Moving forward: How UCI is using this information • Used the analyses as leverage when approaching instructors about retooling courses to incorporate best practices • Using a course with very little discrepancy as a model for other courses • Using funding from UCOP and other initiatives to retool courses

  26. Moving forward: How UCI is using this information • Will follow up with these classes in the future

  27. Takeaways • Can't simply assume prior findings are applicable at your campus • Even if you have studied online ed at your campus, findings may not hold for all offerings

  28. Takeaways • Continuously evaluate and improve offerings at your campus • If things go right, this may result in real changes that improve students' experiences at your school • Be sure to know your stakeholders • You'll be more likely to enact change

  29. Thanks! • Colleagues at UCI OIR • Sarah Eichhorn, Associate Vice Provost for Teaching and Learning • UCI Teaching and Learning Research Center • Di Xu, Assistant Professor – Education • UCI Digital Learning Lab

  30. Contact information: Preston Reed, email: preston.reed@uci.edu Questions?
