
Presentation Transcript


  1. The Role of Learning Analytics: A Personal Journey… 5/22/2013 Oded Meyer, Department of Mathematics and Statistics, Georgetown University

  2. Some background about myself…

  3. Learning Analytics • A data-driven approach for the purpose of understanding and improving learning and the environment in which it occurs.

  4. Carnegie Mellon Open Learning Initiative (OLI): Scientifically based online learning environments that integrate technology and the science of learning with teaching. OLI is designed to simultaneously improve learning and facilitate learning research.

  5. The OLI Statistics Course

  6. Educational Mission of the Funder (2002, The William and Flora Hewlett Foundation) • Provide open access to high-quality post-secondary education and educational materials to those who would otherwise be excluded due to geographical constraints, financial difficulties, or social barriers. • To meet this goal: a complete, stand-alone, web-based introductory statistics course, openly and freely available to individual learners online.

  7. An important form of analytics: The Science of Learning • General principles: • Make the structure and “big picture” salient

  8. Learning Science Principles - continued • Immediate and targeted feedback → students achieve the desired level of performance faster.

  9. Science of Learning - continued • Discipline-specific principles: • Hands-on activities • Use real data • De-emphasize calculations.

  10. Start Analytics Early in the Design Process • After one module: • Qualitative feedback • Observe students • Talk-aloud protocols

  11. Feedback to the instructor about students’ learning → Learning Dashboard • Presents the instructor with a measure of student learning for each learning objective. • More detailed information: the class’s learning of sub-objectives, the learning of individual students, common misconceptions. • Instructor feedback loop analytics

  12. Learning Dashboard Team led by Dr. Marsha Lovett

  13. Instructor feedback loop analytics • Key: • Clearly defined learning objectives. • Tying each activity to a learning objective. • Benefits: • Can reveal misconceptions • Affects how you spend class time • Can reveal “expert blind spots”
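
To make "tying each activity to a learning objective" concrete, here is a minimal sketch of the kind of roll-up such a dashboard can compute. It is not the OLI/Lovett dashboard model (which these slides do not specify); the log format (student, objective, correct) and the simple proportion-correct measure are illustrative assumptions.

```python
from collections import defaultdict

# Hypothetical activity log: each scored attempt is tagged with a learning objective.
attempts = [
    {"student": "s01", "objective": "find_median", "correct": True},
    {"student": "s01", "objective": "find_median", "correct": False},
    {"student": "s02", "objective": "find_median", "correct": True},
    {"student": "s01", "objective": "interpret_boxplot", "correct": False},
    {"student": "s02", "objective": "interpret_boxplot", "correct": False},
]

def objective_summary(attempts):
    """Class-level view: proportion correct per learning objective (a crude learning measure)."""
    totals = defaultdict(lambda: [0, 0])  # objective -> [correct count, attempt count]
    for a in attempts:
        totals[a["objective"]][0] += a["correct"]
        totals[a["objective"]][1] += 1
    return {obj: correct / total for obj, (correct, total) in totals.items()}

def student_summary(attempts, objective):
    """Drill-down: proportion correct on one objective for each individual student."""
    totals = defaultdict(lambda: [0, 0])  # student -> [correct count, attempt count]
    for a in attempts:
        if a["objective"] == objective:
            totals[a["student"]][0] += a["correct"]
            totals[a["student"]][1] += 1
    return {s: c / t for s, (c, t) in totals.items()}

print(objective_summary(attempts))               # class-level view per objective
print(student_summary(attempts, "find_median"))  # per-student drill-down for one objective
```

A real dashboard would replace proportion correct with a proper learning model and add the drill-downs mentioned on slide 11 (sub-objectives, individual students, misconceptions), but the prerequisite is the same: every scored activity carries a learning-objective tag.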

  14. Students who achieved proficiency in finding the median did poorly on the following assessment item tied to the same sub-skill:

  15. Larger Scale Analytics: Assessing the Effectiveness of the Course When Used in the Blended (Hybrid) Mode

  16. Larger Scale Analytics: Assessing the Effectiveness of the Course • The Hewlett Foundation’s “Accelerate Learning Challenge”: • Can students using the OLI course in the blended mode learn the same material as they would in the traditional course in a shorter time, and still have equal or better learning gains?

  17. Three Accelerated Studies • #1: Small class, expert instructor (2007) • #2: Replication with a larger class (2009), with a retention follow-up 4+ months later • #3: Replication with a new instructor (2010): an experienced statistics instructor, new to the OLI Statistics course and the hybrid mode

  18. Study 1: Method • ~180 students enrolled • 68 volunteered for a special section • 24 students in the adaptive/accelerated condition • 44 students in the traditional control condition

  19. Adaptive/Accelerated vs. Traditional (same content, but a different kind of instruction)
  Adaptive/Accelerated: • Two 50-minute classes/wk • Eight weeks of instruction • Homework: complete OLI activities on a schedule • Tests: three in-class exams, a final exam, and the CAOS test
  Traditional: • Four 50-minute classes/wk • Fifteen weeks of instruction • Homework: read the textbook & complete problem sets • Tests: three in-class exams, a final exam, and the CAOS test

  20. Dependent Measure • CAOS = Comprehensive Assessment of Outcomes in a First Statistics course (delMas, Garfield, Ooms, & Chance, 2006) • Forty multiple-choice items measuring students’ “conceptual understanding of important statistical ideas” • Content validity – positive evaluation by 18 content experts • Reliability – high internal consistency • Aligned with the content of the course (both sections) • Administered as a pre/posttest

  21. Study 1: CAOS Test Results • The Adaptive/Accelerated group gained more (18% vs. 3%) pre/post on CAOS than did the Traditional Control group, p < .01.
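
The p-value reported on this slide comes from comparing pre/post gains across the two groups. The slides do not show the exact analysis, so the following is only a sketch of one standard approach, a two-sample (Welch) t-test on gain scores, using invented numbers rather than the study’s data.

```python
# Sketch: compare CAOS pre/post gains between two groups with a two-sample (Welch) t-test.
# The gain scores below are invented for illustration; they are NOT the study's data,
# and the study's actual analysis may have differed (e.g., ANCOVA on posttest scores).
from scipy import stats

accelerated_gains = [0.20, 0.15, 0.22, 0.18, 0.25, 0.12, 0.19, 0.21]  # posttest - pretest, proportion correct
traditional_gains = [0.04, 0.00, 0.05, 0.02, 0.03, 0.06, 0.01, 0.02]

t_stat, p_value = stats.ttest_ind(accelerated_gains, traditional_gains, equal_var=False)
print(f"mean gain (accelerated): {sum(accelerated_gains) / len(accelerated_gains):.2f}")
print(f"mean gain (traditional): {sum(traditional_gains) / len(traditional_gains):.2f}")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```

An alternative that is also common in pre/post designs is an ANCOVA on posttest scores with the pretest as a covariate; which analysis the study actually used is not stated in these slides.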

  22. These analytics got the attention of education leaders in the U.S. who are facing the “cost disease” in higher ed. William Bowen (former President of Princeton) replicated our study in the “Interactive Learning Online at Public Universities” study. The study further indicated that blended learning offers the potential of more economical and rapid pathways to mastery.

  23. Analytics on Students’ Learning Habits • Students in both groups recruited to complete time-logs • Self-report for both groups • Analogous point in the course (2/3 through) • Six consecutive days: Wednesday - Monday

  24. Study 1: Time Spent Outside of Class • No significant difference between groups in the time students spent on Statistics outside of class

  25. Study 2: Replication & Extension • Same method, same procedure, same instructor • Larger class (52 students in Adaptive / Accelerated) • Follow-up retention study conducted 4+ months later

  26. Study 2: CAOS Test Results • The Adaptive/Accelerated group gained more pre/post on CAOS than did the Traditional Control group, p < .01.

  27. Using Analytics to Assess Retention [Timeline figure, Jan-Oct: the Adaptive/Accelerated section ends, the Traditional section ends, and the retention follow-up begins months later; delayed-retest groups: 13 students (Adapt/Acc), 14 students (Traditional)]

  28. Study 2, Retention: Re-taking CAOS • At a 6-month delay, the Adaptive/Accelerated group scored higher on CAOS than the Traditional Control group, p < .01.

  29. Study 3: Further Replication & Extension • Same method, same procedure • New instructor • Not involved in development of OLI course • New to OLI statistics and hybrid teaching mode • Instructor held constant for both Adapt/Acc and Control conditions • Larger class (40 students in Adaptive / Accelerated)

  30. Study 3: CAOS Test Results • The Adaptive/Accelerated group gained more pre/post on CAOS than did the Traditional Control group, p < .01.

  31. Current and Future Analytics • Continued “gap analysis”. • Better alignment between learning objectives/sub-objectives and activities. • Student Dashboard → provides learners with insight into their own learning habits and can give recommendations for improvement. • Learner-facing analytics → allow learners to compare their own performance against an anonymous summary of their course peers.
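
As a rough illustration of the learner-facing idea, the sketch below builds an anonymous class summary (quartiles of peer scores on an objective) that a student’s own score can be compared against without exposing any individual peer. This is not the OLI implementation, which the slide does not describe; the field names, thresholds, and minimum class size are assumptions.

```python
# Sketch of learner-facing analytics: compare one student's score on an objective
# against an anonymous summary of the class. The quartile summary and the
# minimum class size are illustrative assumptions, not the OLI design.
import statistics

def anonymous_peer_summary(scores, min_class_size=10):
    """Return only aggregate statistics, and only if the class is large enough to keep peers anonymous."""
    if len(scores) < min_class_size:
        return None  # too few peers to summarize without risking identification
    q1, median, q3 = statistics.quantiles(scores, n=4)
    return {"q1": q1, "median": median, "q3": q3}

def compare_to_peers(own_score, peer_scores):
    """Turn the anonymous summary into a short recommendation for the learner."""
    summary = anonymous_peer_summary(peer_scores)
    if summary is None:
        return "Not enough peers for an anonymous comparison."
    if own_score >= summary["q3"]:
        return "You are in the top quarter of the class on this objective."
    if own_score >= summary["median"]:
        return "You are above the class median on this objective."
    return "You are below the class median; consider revisiting this objective's activities."

class_scores = [0.55, 0.62, 0.71, 0.48, 0.80, 0.66, 0.59, 0.73, 0.68, 0.52, 0.77, 0.61]
print(anonymous_peer_summary(class_scores))
print(compare_to_peers(0.70, class_scores))
```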

  32. Lessons Learned… • 1. Pedagogy must drive technology, and not the other way around. • 2. Developing online materials is a collaborative effort. • 3. Developing online materials is an iterative process. • 4. Steep learning curve. • 5. Growth as a teacher.

  33. “Improvement in post-secondary education will require converting teaching from a ‘solo sport’ to a community-based research activity” Herbert Simon, Last Lecture Series, Carnegie Mellon, 1998
