
Student-Pathway Analytics Using Microsimulation

Student-Pathway Analytics Using Microsimulation. Ron Anderson rea@umn.edu. October 27, 2011.


Presentation Transcript


  1. Student-Pathway Analytics Using Microsimulation. Ron Anderson, rea@umn.edu. October 27, 2011. This project was funded by the National Science Foundation ATE Program for Targeted Research. The grant was to Colorado University’s DECA Project, Liesel Ritchie, PI, with a subcontract to Rainbow Research for Project I, Strategies for Improving Recruitment, Retention and Placement.

  2. Overview of the Challenge
  • Community college completion rates are embarrassingly low, at 20 to 40% within 8 years.
  • Advanced Technology Programs (ATP), while not as bad as non-ATP programs, still lose over 50% of their students before completion.
  • Gender inequality remains a serious problem in NSF ATE projects.
  • Recruitment of racial minorities is improving in NSF ATE projects.
  • NSF ATE projects neglect student advising and other strategies to retain students.

  3. Data from Program Improvement Projects in the annual ATE Survey by Western Michigan University: www.evalu-ate.org

  4. Data from Program Improvement Projects in the annual ATE Survey by Western Michigan University: www.evalu-ate.org

  5. Data from Program Improvement Projects in the annual ATE Survey by Western Michigan University: www.evalu-ate.org

  6. Advanced Technology Programs (ATP) Fail to Attract Women. Data graphed are first-term enrollments by gender for ATP and non-ATP programs, based on all students enrolled in Connecticut Community Colleges, 1999-2009 (N=120,000).

  7. Overview of National Strategies to Improve Student Completions
  • Many organizations are trying to address the completion/success gap in 2-year colleges.
  • The analytics movement is attempting to forecast student dropouts.
  • White House Committee on Measures of Student Success
    • Appointed in 2010
    • Interim report in Sept. 2011
    • April 2012 target for preliminary report
    • Years before impact is likely

  8. Other National Organizations Developing Policy to Facilitate College Student Success
  • Common Completion Metrics (National Governors Association)
  • Voluntary Framework of Accountability (AACC)
  • Foundations of Excellence in the First College Year (Gardner Institute)
  • Complete College America
  • Achieve, Inc. (35-state network)
  • Achieving the Dream (database and dashboards)
  • Western Interstate Commission for Higher Education (WICHE) – Human Capital Database Project
  • Gates Foundation–funded analytics initiatives
  • National Agenda for Analytics (EDUCAUSE)

  9. Analytics Projects in Higher Ed
  • Predictive Analytics (Capella U. & others)
  • Data Analytics (Sinclair Community College)
  • Incisive Analytics (IncisiveAnalytics.com)
  • Platinum Analytics (AstraSchedule.com)
  • Action Analytics (symposia in 2009 & 2010, and EDUCAUSE in 2011)
  • Learning Analytics (1st International Conference on Learning Analytics, Feb. 27, 2011)
  • Student Success Analytics (Purdue U., etc.)

  10. The Problem with Current Analytics
  • "Analytics" is sometimes used as a synonym for "analysis" to sound impressive.
  • More precisely, "analytics" refers to predictive analytics: the analysis of trend data to predict future events for individuals or populations.
  • Current analytics does not follow individual course-taking histories across time, so it is weak at providing individualized information that students can use.

  11. Typical Analytics Data: a Trend Line, Not a Trajectory. (Trend lines give no information about change in individual attributes over time, only aggregates.)

  12. Cohorts Showing Student Trajectories for 120,000 student histories in Conn.

  13. Student-Pathway Trajectories Showing Race Gaps. Data are all 2,407 students first enrolled in Fall 2005 in the Community College of Rhode Island system. Completion is defined as graduation, articulation, or completion of 48+ credits within 7 terms (4.5 years).

  14. The Solution: Student-Pathway Analytics
  • Recent dynamic microsimulation techniques make it possible to follow individual course-taking histories (trajectories) across time.
  • Thus, using student transcript records, models can be built that simulate student enrollment decisions term by term.
  • The results give information that students and student advisors can use to greatly improve the chances of completing a program successfully.

  15. MicroCC (Microsimulation for Community College [CC] Students)
  • A microsimulation model developed in the Modgen programming language from Statistics Canada.
  • Hundreds of thousands of student transcript records from the CCs of Connecticut and Rhode Island were used as test data sets.
  • For any given data set, each scenario simulation is run on an equivalent sample of 5 million students to suppress random variability; a run takes only about 2-3 minutes.
  MicroCC was developed with Targeted Research funds from the NSF ATE program.
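
Why a 5-million-student replication is enough: a back-of-the-envelope check (not from the presentation) shows how small the Monte Carlo noise on a simulated completion rate becomes at that scale. The 35% completion rate below is an assumed illustrative value, not a MicroCC result.

```python
# Back-of-the-envelope check (illustrative only): Monte Carlo noise in a
# simulated completion rate shrinks with the size of the simulated population.
import math

def completion_rate_se(p: float, n: int) -> float:
    """Standard error of a simulated completion rate p over n simulated students."""
    return math.sqrt(p * (1 - p) / n)

# Assumed completion rate of 35%; with 5 million simulated students the
# Monte Carlo standard error is roughly 0.02 percentage points.
print(completion_rate_se(0.35, 5_000_000))   # ~0.000213
```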

  16. Features of the Current MicroCC Model
  • The initial model includes 4 student choices or behaviors (details on the next slide).
  • The model's core predictive factors are derived from the data at hand.
  • 28 separate logistic (and ordered logit) regression models are run to calculate coefficients for each factor and interaction that predicts success or completion (a simplified sketch of one such model follows below).
  • Multiple scenarios can be simulated by modifying either
    • starting populations (mostly demographic factors)
      • gender, race, age, and initial full-/part-time status
    • effect coefficients for student decisions, or
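
The slides do not show the actual regression specifications. Below is a minimal sketch, assuming a statsmodels workflow, of how one such term-level logistic model (here, the re-enrollment decision) might be estimated from transcript data. The file name and column names (reenrolled, female, minority, age_over_21, full_time, courses_passed) are hypothetical placeholders, not MicroCC's actual variables.

```python
# Minimal sketch (not the actual MicroCC specification): estimating one of the
# term-level logistic models, here the probability of re-enrolling next term.
import pandas as pd
import statsmodels.formula.api as smf

# One row per student-term (assumed layout and hypothetical file name).
terms = pd.read_csv("term_records.csv")

model = smf.logit(
    "reenrolled ~ female + minority + age_over_21 + full_time + courses_passed",
    data=terms,
).fit()

# The estimated coefficients would feed the simulation as effect parameters.
print(model.summary())
```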

  17. Process Decision Points: MicroCC completes this decision sequence for each term of each student
  1) Enrollment / re-enrollment choice in each term
  2) Full- vs. part-time enrollment in each term
  3) Number of courses attempted
  4) Successful completion of each course attempted
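
A minimal sketch of this four-step decision sequence as a per-term loop. MicroCC itself is written in Modgen and draws its probabilities from the fitted regression models; the fixed probabilities below are placeholders used only to show the control flow.

```python
# Minimal sketch of the per-term decision sequence (illustrative only).
import random

def simulate_student(max_terms: int = 10) -> int:
    """Return total courses passed by one simulated student over max_terms terms."""
    courses_passed = 0
    for term in range(max_terms):
        # 1) Enrollment / re-enrollment choice in each term (placeholder probability)
        if random.random() > 0.80:
            continue                      # stopout: student skips this term
        # 2) Full- vs. part-time enrollment in each term
        full_time = random.random() < 0.55
        # 3) Number of courses attempted
        attempted = 4 if full_time else 2
        # 4) Successful completion of each course attempted
        courses_passed += sum(random.random() < 0.70 for _ in range(attempted))
    return courses_passed

print(simulate_student())
```

In MicroCC each of these placeholder probabilities would instead depend on the student's characteristics and history through the estimated coefficients, and the loop would run for millions of simulated students per scenario.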

  18. Measures of Success: Some Indicators Used
  • Success = completion of program (graduate, certificate, successful transfer, or completion of a required number of courses)
  • Total courses completed = completion of 12 or more courses within 10 terms (5 years)

  19. Selected Process Indicators
  • Momentum Point One passed – student completed 3 courses in the first term
  • Momentum Point Two passed – student completed 6 courses in year one
  • Stopout – student temporarily does not enroll in term X
  • Stopouts – total terms the student stopped out
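
A sketch of how these process indicators might be computed from a single student's term history, assuming a simple list-of-dictionaries record layout (not MicroCC's actual data structure) and two terms per academic year.

```python
# Sketch of the process indicators above, computed from one student's term
# history. The record layout is an assumption, not MicroCC's format.
def process_indicators(terms: list[dict]) -> dict:
    """terms: one dict per calendar term, e.g. {"enrolled": True, "courses_passed": 3}."""
    first_term = terms[0]
    year_one_passed = sum(t["courses_passed"] for t in terms[:2])  # assuming 2 terms/year
    stopout_terms = [i + 1 for i, t in enumerate(terms) if not t["enrolled"]]
    return {
        "momentum_point_one": first_term["courses_passed"] >= 3,   # 3 courses in first term
        "momentum_point_two": year_one_passed >= 6,                # 6 courses in year one
        "stopout_terms": stopout_terms,                            # which terms were skipped
        "total_stopouts": len(stopout_terms),
    }

history = [
    {"enrolled": True, "courses_passed": 3},
    {"enrolled": False, "courses_passed": 0},
    {"enrolled": True, "courses_passed": 4},
]
print(process_indicators(history))
```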

  20. Background Indicators
  • Used in MicroCC
    • Gender (M/F)
    • Race (W/B/L/O)
    • Age (to 21 / 22+)
    • Starting-term enrollment full-time vs. part-time
  • Data not available in 2010 for the MicroCC model
    • Financial aid in term X
    • Concurrent job
    • Marital status
    • Prior postsecondary education

  21. Issues that Emerged from Processing Student Records
  • Data restructuring – Creating a longitudinal file from term-level files can be done, but it is time consuming (see the sketch after this list).
  • Missing data – Records on transfer status, graduations, and certificate completions may be incomplete or nonexistent.
  • Summer term challenge – Can summer credits be ignored completely, because so few regular students enroll in summer terms, or should credits and courses completed during the summer be added to the counts for the previous term?
  • Developmental courses – Developmental courses were tracked, but institutions handled them differently.
  • Transfer credits – Are they added to new credits, and if so, when?
  • Simultaneous enrollments – In Connecticut we found many students enrolled in multiple colleges during a single term.
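
For the first issue, a rough sketch of the restructuring step, assuming pandas and one CSV extract per term; the file pattern and column names (student_id, term, credits_completed) are hypothetical, and real registrar extracts would need far more cleaning than this.

```python
# Sketch of the longitudinal restructuring described above: stacking term-level
# extracts and pivoting to one row per student. Names are hypothetical.
import glob
import pandas as pd

term_files = glob.glob("extracts/term_*.csv")          # one file per term (assumed)
stacked = pd.concat((pd.read_csv(f) for f in term_files), ignore_index=True)

# One row per student, one column per term, holding credits completed.
longitudinal = stacked.pivot_table(
    index="student_id", columns="term", values="credits_completed", aggfunc="sum"
)
# aggfunc="sum" also folds simultaneous enrollments at multiple colleges into a
# single count per term; summer credits could likewise be rolled into the prior term.
longitudinal.to_csv("longitudinal.csv")
```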

  22. Screen print from MicroCC with Student Success Model for Baseline scenario with RI and CT data

  23. The Data: Student-Pathway Analytics
  • Data for the MicroCC microsimulations came from two state enrollment databases:
    • Community College of Rhode Island – 5 annual cohorts, with most analysis on the 2,502 students first enrolled in Fall 2005 and followed for 4.5 years
    • Connecticut Community College system – 276,469 students in 10 annual cohorts, Fall 1999 through 2009

  24. Screen print from MicroCC with Student Pathways Models for the Baseline scenario with RI and CT data. Sample output table for student success rates by term.

  25. Microsimulation results showing the role of student decisions in closing the 17% gap between white and Hispanic students in success rates. Source: RICC, 2005 cohort, N=2,502.

  26. Advantages of MicroCC Analysis
  • Gaps in success can be deconstructed, identifying the student pathways that created specific portions of the gap (see the sketch below).
  • These results have direct relevance for students and guidance counselors working toward improving success rates.
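
One way to read "deconstructing a gap" is as a counterfactual exercise: rerun the simulation for the lower-completing group with one decision parameter at a time borrowed from the higher-completing group, and attribute the resulting change in the completion rate to that decision. The toy sketch below illustrates only that logic; the stand-in completion function and all numbers are made up, not MicroCC output.

```python
# Sketch of the gap-deconstruction idea (not MicroCC's actual code): substitute
# one decision parameter at a time and see how much of the completion gap closes.
def completion_rate(p: dict) -> float:
    """Toy stand-in for a full microsimulation run."""
    return p["reenroll"] * p["full_time_share"] * p["course_pass_rate"]

group_low  = {"reenroll": 0.75, "full_time_share": 0.45, "course_pass_rate": 0.65}
group_high = {"reenroll": 0.85, "full_time_share": 0.55, "course_pass_rate": 0.75}

baseline = completion_rate(group_low)
gap = completion_rate(group_high) - baseline
for decision, value in group_high.items():
    counterfactual = dict(group_low, **{decision: value})      # swap one parameter
    share = (completion_rate(counterfactual) - baseline) / gap
    print(f"{decision}: accounts for about {share:.0%} of the gap")
```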

  27. Process Decision Points: MicroCC completes this decision sequence for each term of each student
  1) Enrollment / re-enrollment choice in each term
  2) Full- vs. part-time enrollment in each term
  3) Number of courses attempted
  4) Successful completion of each course attempted

  28. The remaining findings apply only to Connecticut CC students who were enrolled in Advanced Technology Programs (ATP)
  • Most (90%) of the CT students in ATPs were in engineering and manufacturing programs; the remainder were in IT, networking, and miscellaneous science and technology programs.
  • The 7,310 ATP enrollees in CT were only 6% of all CC students.
  • As shown in the next chart, ATP students had a 17% higher completion rate than non-ATP students.

  29. Student Decisions Explain the 17% Success Gap between Adv. Tech. Students and Others. Source: 7,310 ATE students in Connecticut CCs, 2000-2009.

  30. Student Enrollment Decisions Are Not Magical
  • How much impact they have on success depends on the specific region, school, and curricular program.
  • If a student enrolls full time while also working full time and raising children, s/he might not do well in coursework and thus not keep up the momentum toward completion.

  31. Student enrollment decisions are not a one-time event; they are repeated each term until completion or dropout
  • Both students and their advisors need to understand how crucial these decisions are to pathway success:
    • 1. To enroll continuously – no stopouts
    • 2. To enroll full time
    • 3. To take a larger number of courses each term, within reason
    • 4. To pass the courses attempted
  • The simulation model incorporates these decisions not just at first enrollment, but at every term in which the student is enrolled.

  32. Other Examples Follow
  • The remaining charts from the microsimulations illustrate how student decisions influence different subgroups of students within ATP programs in CT.
  • Example 1 shows elements of the gap between white and Hispanic men in CT ATPs.
  • Example 2 highlights the higher completion rates of women over men in CT ATPs.

  33. Student Decisions Explain the 8% Success Gap between White & Hispanic Males in Advanced Technology Programs. Source: 7,310 ATE students in Connecticut CCs, 2000-2009.

  34. Women Outpace Men in All Race Categories – Percent of Students Completing Their Programs by Gender & Race in Conn. (N=7,310 ATE students)

  35. Broader Capabilities of MicroCC
  • Microsimulation can uncover enrollment decisions that have huge effects on student success.
  • These student decisions can sometimes explain demographic differences.
  • Adding further data (e.g., job history, financial aid) and retention interventions (e.g., mentoring) as factors in the models can make the methodology even more powerful.
  • Enrollment forecasting can be done with greater precision.
  • The model could also be extended to include post-schooling job trajectories.
  For more information contact Ron Anderson, rea@umn.edu, or 952-473-5910.

  36. Implications for NSF ATE
  • The ATE program should invest in student tracking data systems, either in conjunction with existing student record systems or, better yet, as a separate data system to which ATE-funded projects are required to contribute.
  • ATE-funded projects should be encouraged or required to address and report on student advising practices.
  • Training should be developed for high school and community college student advisors regarding the needs of STEM students.
  • Recruitment of women (with improved advising) into STEM pathways needs to be given greater priority.

  37. Implications for NSF ATE Projects
  • NSF ATE projects may be neglecting student advising and related strategies to retain students.
  • Of the 305 projects and centers recently funded by the NSF ATE program, only two mentioned "student advising" or "guidance counseling" in their title or abstract. However, 10 projects (about 3%) mentioned "counselors."
  • ATE projects could use the findings of MicroCC simulations as guides for student advising. A system for student progress coaching and advising is needed in every ATE-funded project.

  38. Hypothetical Intervention Study Results with a 15% success gap explained by 3 student decisions

  39. More Implications
  • Microsimulations should be run on many more states, college populations, and ATE program populations, so that findings can be tailored to specific groups of at-risk students.
  • Input data for simulations should be expanded to include job status, financial aid, and other items relevant to student success.
  • Microsimulation should be extended to include articulation and job acquisition processes.
