
Approaches to predicting College Student Success: Implications


Presentation Transcript


  1. Approaches to predicting College Student Success: Implications J. B. Arbaugh, University of Wisconsin Oshkosh UMUC Learner Analytics Summit July 22, 2014

  2. Research On Analytics in Higher Ed Beginning to Appear • Conference presentations galore • Special issue of JALN (2012) • Several articles in EDUCAUSE Review • Journal of Learning Analytics debuted in 2014 • Ethical questions regarding use of analytics in higher ed already being considered (Slade & Prinsloo, 2013)

  3. Some Observations from the Literature on Community Colleges and Analytics • Increasing attention given to analytics by entities such as AACC (Norris & Baer, 2013) • Analytics adoption at CCs seen as 2-3 years out (along with “next generation” LMSs), whereas flipped classrooms and social media are seen as more pressing adoptions (Johnson et al., 2013) • Phillips & Horowitz (NDCC, 2013): • Data entry approaches are critical – consistency of entry between new and old systems • Opportunities for trend analysis – either of a cohort of students or year-by-year comparisons in a program or college (see the sketch below) • Make data accessible and useful to ALL staff
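
To make the trend-analysis point concrete, here is a minimal sketch (Python/pandas; the table, program name, and column names are all hypothetical, not from the presentation) of a year-by-year cohort comparison within one program:

```python
import pandas as pd

# Hypothetical program-level records; column names are illustrative.
records = pd.DataFrame({
    "cohort_year": [2011, 2011, 2012, 2012, 2013, 2013],
    "program":     ["Nursing"] * 6,
    "completed":   [1, 0, 1, 1, 0, 1],
})

# Year-by-year completion rates for one program: the kind of trend
# comparison Phillips & Horowitz describe.
trend = records.groupby(["program", "cohort_year"])["completed"].mean()
print(trend)
```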

  4. What/who gets little attention in this research? • Instructors!!! • Only one article found written from the instructors’ perspective (Dietz-Uhler & Hurn, 2013) • Instructors aren’t even considered part of the learning analytics ethical equation (Slade & Prinsloo, 2013) • Even when instructors are considered, it is almost always from a post hoc perspective

  5. But don’t feel too bad about this… • Instructors in virtual settings have a long history of being understudied: • Distance Education (Dillon & Walsh, 1992; Thompson, 2007) • Online Delivery in General (Dziuban, Shea, & Arbaugh, 2005; Shea, 2007; Shea, Pickett, & Li, 2005) • Online Delivery in Business Schools (Arbaugh, DeArmond, & Rau, 2013; Shea, Hayes, & Vickers, 2010)

  6. How to make instructors love data scientists (Fornaciari & Arbaugh, 2014) • Use predictive analytics to create a priori tools: • Class section composites • Collective demographics • Collective GPA • Prior online course experience • Prior online/blended course content access behavior • Prior online course discussion behavior
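
As a sketch of what such a priori, section-level composites might look like in practice (assuming per-student records are available in a pandas DataFrame; all column names here are illustrative assumptions, not from the presentation):

```python
import pandas as pd

# Hypothetical per-student enrollment records; column names are illustrative.
students = pd.DataFrame({
    "section_id":            ["A1", "A1", "A1", "B2", "B2"],
    "gpa":                   [3.1, 2.4, 3.8, 2.9, 3.5],
    "prior_online_courses":  [0, 2, 1, 3, 0],
    "prior_logins_per_week": [4.0, 1.5, 6.2, 2.8, 0.0],
    "prior_posts_per_week":  [2.1, 0.4, 3.0, 1.2, 0.0],
})

# Collapse individual records into section-level composites that an
# instructor could see before the term starts.
composites = students.groupby("section_id").agg(
    n_students=("gpa", "size"),
    mean_gpa=("gpa", "mean"),
    mean_prior_online=("prior_online_courses", "mean"),
    mean_logins=("prior_logins_per_week", "mean"),
    mean_posts=("prior_posts_per_week", "mean"),
)
print(composites)
```

An instructor could receive such a composite before the first class meeting, which is what makes the tool a priori rather than post hoc.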

  7. Interventions: What to do with results of analytics? The case of “Mentoring” Peter Shea, Associate Provost for Online Learning & Associate Professor, School of Education University at Albany, SUNY

  8. Quick PSA • Editor of the Journal of Asynchronous Learning Networks (JALN) (soon to be Online Learning) • You are invited! • http://jaln.sloanconsortium.org/ • Register to be considered as an author, reviewer, etc. • http://jaln.sloanconsortium.org/index.php/index/user/register

  9. Major Challenge • Getting insights from analytics to the “front lines” • What should practitioners “do” with results? • A typical answer: notify instructors or staff so they can better advise, support, tutor, mentor, or instruct students at risk • But what should these interventions look like? • Beyond analytics, what models and research inform the design of interventions? Especially for online students?
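
One hedged illustration of getting results to the “front lines”: a minimal sketch that routes model scores above a threshold to instructor notifications. The threshold, field names, and message format are all hypothetical.

```python
from dataclasses import dataclass

# Illustrative cutoff; a real system would calibrate this against a
# validated predictive model.
RISK_THRESHOLD = 0.7

@dataclass
class StudentRisk:
    student_id: str
    instructor_email: str
    risk_score: float  # e.g., predicted probability of failing/withdrawing

def route_alerts(predictions: list[StudentRisk]) -> list[str]:
    """Turn model output into messages for instructors or advisors."""
    alerts = []
    for p in predictions:
        if p.risk_score >= RISK_THRESHOLD:
            alerts.append(
                f"To {p.instructor_email}: student {p.student_id} flagged "
                f"(risk={p.risk_score:.2f}); consider outreach or mentoring."
            )
    return alerts

print(route_alerts([StudentRisk("s001", "prof@example.edu", 0.82)]))
```

The open question on this slide remains: the notification is the easy part; what the intervention itself should look like is what the rest of the presentation takes up.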

  10. [Diagram: Tinto’s persistence model: Pre-entry Attributes → Goals/Commitments → Institutional Experience → Integration → Goals/Commitments → Outcome] • Assumes these as “given” • Why a univariate outcome? • Where is institutional response? • Where are “interventions”?

  11. Non-traditional students • ‘‘We label none of the thirteen propositions of Tinto’s theory as reliable knowledge [about] commuter colleges” (Braxton & Lee, 2005)

  12. [Diagram: Bean & Metzner’s (1985) model for non-traditional students: learner characteristics, academic variables, and environmental variables → academic and psychological outcomes → decision to drop out] • Assumes these as “given” • Where is institutional response? • Where are “interventions”?

  13. Newer Models • Newer models improve on past work • Multivariate outcomes, for example • Still lacking attention to proactive intervention

  14. [Diagram: Falcone’s (2012) model with multivariate outcomes] • Still assumes these as “given”? • Interventions: for retention or persistence? • Where is institutional response? • Where are “interventions”?

  15. Empirical vs. Conceptual Models • Previous models have been largely based on conceptual work • Opportunities exist to test these empirically • Analytics is clearly one way • Other ways too, for example:

  16. National Empirical Results Interventions targeted toward: • preventing delayed entry to college • increasing part-timers’ level of enrollment • boosting financial aid in community colleges • reducing students’ work hours have the greatest numerical potential for improving completion rates. Caveat: national results don’t tell me about my institution and are perhaps too general to be useful…

  17. The issue with that advice… • “…thinking you can tell every student to switch from part-time to full-time is insane, trying to reduce every student’s work hours is nuts – just not going to happen.” • (Milliron, 2014)

  18. Example of Problem • Online learning at the community college level – what are the problems and what would an intervention look like and why?

  19. Online Learning Impedes CC Progress? • Data: 24k students in 23 institutions in the Virginia Community College System; 51k in Washington • Online learning outcomes at community colleges worse than classroom outcomes • Smith Jaggars & Xu, 2010; Xu & Smith Jaggars, 2011: • More students failing/withdrawing from online courses • Online students less likely to return in following semesters • Online students less likely to attain a credential or transfer to a 4-year institution.

  20. Conflicting Results • Shea & Bidjerano (2014). A national study of online community college students. Computers & Education, 75, 103-111. • Net of 40 other factors, DE/online learners were 1.25 times as likely to attain any credential • When the credential goal was a certificate (rather than a BA), DE/online learners were 3.22 times as likely to succeed
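
Figures like “1.25 times as likely, net of 40 other factors” are odds ratios from a logistic regression with controls. Here is a minimal sketch of how such an estimate is produced, using entirely synthetic data and a single stand-in covariate rather than the study’s actual 40 controls (statsmodels assumed available):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic illustration only; not the Shea & Bidjerano data.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "online": rng.integers(0, 2, n),   # took distance/online courses (0/1)
    "gpa": rng.normal(3.0, 0.5, n),    # one covariate standing in for 40 controls
})
# Simulate credential attainment with a positive online effect:
# a log-odds coefficient of 0.22, i.e., an odds ratio of exp(0.22) ≈ 1.25.
logit = -1.0 + 0.22 * df["online"] + 0.8 * (df["gpa"] - 3.0)
df["credential"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["online", "gpa"]])
model = sm.Logit(df["credential"].astype(int), X).fit(disp=0)
# Exponentiating the coefficient recovers the odds ratio for online study.
print(np.exp(model.params["online"]))
```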

  21. Still More Conflicting Results: California Community Colleges • These mixed results recently extended • Nation’s largest two-year system, enrolling nearly 1 million online students • Methods employed and results similar to the Washington and Virginia studies • Community college online learners less likely to complete an online course and less likely to get a passing grade • BUT… more likely to graduate and transfer to a 4-year college… • (Johnson, H., & Cuellar Mejia, M. (2014). Online Learning and Student Outcomes in Community Colleges. Public Policy Institute of California.)

  22. Conclusion • Something funny is going on… • Online students fail courses at higher rates but graduate and transfer at higher rates? • More research is needed… • The clear consensus that online course-level outcomes are uniformly worse is very troubling; how do we address it?

  23. Assumptions run in one direction

  24. A two-way model is possible

  25. Proposed new “transactional” model Shea & Bidjerano, 2014

  26. One Example: Mentoring • Analytics indicate that certain students are at risk in online environments • How do we develop an institutional response? • One example: mentoring • But what is it? Not as easy as it seems…

  27. Problem • “…there is a false sense of consensus, because at a superficial level everyone “knows” what mentoring is. But closer examination indicates wide variation in operational definitions, leading to conclusions that are limited to the use of particular procedures.” (Wrightsman, 1981, pp. 3–4, in Jacobi, 1991, p. 508)

  28. Not well defined • More than 50 definitions of mentoring used in the research literature (Crisp & Cruz, 2009). • The result is that the concept is devalued, because everyone is using it loosely, without precision, and it may become a short-term fad. (Wrightsman, 1981, pp. 3–4, in Jacobi, 1991, p. 508)

  29. Benefit • This framework provides a detailed yet concise method of communicating exactly what we mean when using the word mentoring • May also act as a useful set of prompts for educators designing new mentoring interventions.

  30. Interventions • Designing a specific intervention based on identification of at risk students through analytics is a complex process • Intervention needs to be highly specified to demonstrate effectiveness and be replicable

  31. Components of a Mentoring Model • Objectives: The aims or intentions of the Mentoring Model • Roles: A statement of who is involved and their function • Cardinality: numbers involved in mentoring relationship (one-to-one, one-to-many) • Tie strength: the intended closeness of the mentoring relationship (intensity) • Relative seniority: comparative expertise, or status of participants (peers, staff, faculty) • Time: length of a mentoring relationship, regularity of contact, and quantity of contact (dosage) • Selection: how mentors and mentees are chosen • Matching: how mentoring relationships are composed

  32. Components of a Mentoring Model • Activities: actions that mentors and mentees can perform during their relationship • Resources and tools: technological or other artifacts available to assist mentors and mentees • Role of technology: the relative importance of technology to the relationship (F2F, Online, Blended) • Training: how necessary understandings and skills for mentoring will be developed in participants • Rewards: what participants will receive to compensate for their efforts • Policy: rules and guidelines on issues such as privacy or the use of technology • Monitoring: what oversight will be performed, what actions will be taken under what circumstances, and by whom • Termination: how relationships are ended
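
Specifying all 16 components is what makes an intervention replicable (see slide 33). As a sketch only, the framework could be encoded as a simple record; the field names follow the slides, while the types and defaults are assumptions:

```python
from dataclasses import dataclass, field

# One possible encoding of the 16-component mentoring framework.
# Field names come from slides 31-32; everything else is hypothetical.
@dataclass
class MentoringModel:
    objectives: str              # aims or intentions of the model
    roles: str                   # who is involved and their function
    cardinality: str             # e.g., "one-to-one", "one-to-many"
    tie_strength: str            # intended closeness/intensity
    relative_seniority: str      # peers, staff, faculty
    time: str                    # length, regularity, quantity (dosage)
    selection: str               # how mentors and mentees are chosen
    matching: str                # how relationships are composed
    activities: list[str] = field(default_factory=list)
    resources_and_tools: list[str] = field(default_factory=list)
    role_of_technology: str = "online"   # F2F, online, blended
    training: str = ""
    rewards: str = ""
    policy: str = ""             # e.g., privacy, use of technology
    monitoring: str = ""         # oversight, actions, and by whom
    termination: str = ""        # how relationships are ended
```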

  33. Developing a Mentoring Intervention • Would include analysis of all 16 elements • Development can’t happen in a vacuum • Must be a researcher-practitioner collaboration (2yr, 4yr, university, etc.) • Sustained process addressing a specific problem: • inputs • activities • outputs • outcomes • impact

  34. Partial Example • Development of a replicable, community college, online-faculty mentor model to address academic and social disengagement through group-mentored instruction in planning, time management, and self-regulation, with additional peer-mentored social-media networks to support at-risk student engagement and persistence and to increase graduation and transfer.
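
Using the hypothetical MentoringModel record sketched after slide 32, this partial example might be specified as follows. Values are paraphrased from the slide where possible; fields the slide does not address are marked as assumptions, and the remaining components would still need specification before the intervention is fully defined:

```python
# Assumes the MentoringModel dataclass defined after slide 32 is in scope.
model = MentoringModel(
    objectives="Address academic and social disengagement; increase graduation and transfer",
    roles="Online faculty mentors, peer mentors, at-risk community college students",
    cardinality="one-to-many (group mentoring)",
    tie_strength="moderate",                   # assumption; not stated on the slide
    relative_seniority="faculty-to-student, plus peer-to-peer",
    time="one semester, weekly contact",       # assumption; not stated on the slide
    selection="students flagged as at-risk through analytics",
    matching="by course section",              # assumption; not stated on the slide
    activities=["group-mentored instruction in planning, time management, self-regulation"],
    resources_and_tools=["peer-mentored social-media networks"],
    role_of_technology="online",
    # Training, rewards, policy, monitoring, and termination still unspecified.
)
```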

  35. Questions • J. B. Arbaugh, University of Wisconsin Oshkosh • arbaugh@uwosh.edu • Peter Shea, University at Albany, SUNY • pshea@albany.edu
