
Open-source LA and “what the student does”


Presentation Transcript


  1. Open-source LA and “what the student does” Dr Phill Dawson, Monash; Dr Tom Apperley, Melbourne

  2. Structure • Until 3:30pm • You will get a break • I will talk a bit (background concepts) • I will show some tools • You will talk a lot • You will write an algorithm • You will probably argue about ethics

  3. Dr Phillip (Phill) Dawson • Lecturer in Learning and Teaching at Monash • Led a small grant in learning analytics • Interested in how academics make decisions

  4. Who are you? • LA researchers? • Educational designers? • LMS administrators? • University managers? • Faculty-based academics? • Academic developers? • Non-university?

  5. “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (SoLAR)

  6. What LA do you use currently in everyday teaching? (ie not research) • Nothing? • Reports? (eg ‘who has logged in?’; ‘who has submitted assignment one?’) • Dashboards? • Something else?

  7. “Who is struggling or not engaging with my course?”

  8. Free, modular, configurable, extendable, open-source learning analytics block for teachers to identify students at risk

  9. “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (SoLAR)

  10. Learning happens because of… • What the student is? • What the teacher does? • What the student does? Biggs 1999

  11. “learning is what the student does”

  12. Activity - pairs • What do students do to learn in your context? • Long list • Specific • Verb stems • Online and offline • Effective and ineffective • Deep and strategic

  13. Which of these can’t possibly be captured by LA? Which of these can easily be captured by LA?

  14. Sci-fi LA vs Real LA • Images: Flickr user sndrv http://www.flickr.com/photos/sndrv/4519088620/ (CC-BY); Flickr user dpape http://www.flickr.com/photos/dpape/2720632752/ (CC-BY)

  15. Typical Open-Source LA Tools • Gather data on student use of parts of the LMS • No integration with Student Management Systems • Teacher dashboards • Reports • Some synthesis • Inconsistent design and language (at code and UI levels)

  16. State of the Actual: Free/open/built-in LA • Open-source tools • Engagement Analytics (documentation, demo vid) • Gismo • Analytics and Recommendations • Vendor-supplied reports (eg Desire2Learn)

  17. Modular ‘indicator’ architecture (so you can make additional indicator plugins) • Completion • Facebook • Downloads • Attendance
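
A note on what this modular architecture implies: each indicator is a self-contained plugin that reports a risk score for a student, and the block combines those scores using weights the teacher can configure. The sketch below is a hypothetical illustration in Python only (the actual Engagement Analytics block is a Moodle plugin written in PHP, and its real API differs); all class and function names here are made up for the example.

    # Hypothetical sketch only: the real block is a PHP/Moodle plugin with a different API.
    from abc import ABC, abstractmethod

    class Indicator(ABC):
        """One pluggable source of risk evidence about a single student."""

        @abstractmethod
        def risk(self, student_id):
            """Return a risk score between 0.0 (no risk) and 1.0 (high risk)."""

    class AttendanceIndicator(Indicator):
        """Toy example: risk grows with the proportion of classes missed."""

        def __init__(self, classes_missed, classes_so_far):
            self.classes_missed = classes_missed  # dict of student_id -> classes missed
            self.classes_so_far = classes_so_far

        def risk(self, student_id):
            missed = self.classes_missed.get(student_id, 0)
            return min(missed / max(self.classes_so_far, 1), 1.0)

    def combined_risk(student_id, weighted_indicators):
        """Weighted sum of (indicator, weight) pairs; weights assumed to sum to 1.0."""
        return sum(weight * indicator.risk(student_id)
                   for indicator, weight in weighted_indicators)

Adding a new indicator (say, a downloads or Facebook indicator) then only means writing another plugin of the same shape and giving it a weight; the teacher-facing dashboard does not need to change.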

  18. How does the assessment indicator work? • If an assignment is very late it is riskier than if it is just a little late • If an assignment is worth a greater percentage then it is riskier than if it is worth a small percentage • If an assignment is past its due date and not submitted then it is riskier than if it was submitted late

  19. for each assignment, quiz, lesson in the course whose due date has passed {
          daysLateWeighting = ((number of days late) - overdueGraceDays) / (overdueMaximumDays - overdueGraceDays)
          assessmentValueWeighting = (value of this task) / totalAssessmentValue
          if (daysLateWeighting > 1) {
              daysLateWeighting = 1
          } else if (daysLateWeighting < 0) {
              daysLateWeighting = 0
          }
          if (task was submitted late) {
              risk = risk + daysLateWeighting * assessmentValueWeighting * overdueSubmittedWeighting
          } else if (task was not submitted) {
              risk = risk + daysLateWeighting * assessmentValueWeighting * overdueNotSubmittedWeighting
          }
      }
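
For readers who prefer running code to pseudocode, here is a minimal Python rendering of the algorithm above. It is an illustration only: each task is assumed to be a simple dict, and the default weightings shown are arbitrary; the real indicator lives inside the Moodle block and is written in PHP.

    # Illustrative translation of the slide's pseudocode; data structures and defaults are assumed.
    def assessment_risk(tasks, total_assessment_value,
                        overdue_grace_days=0, overdue_maximum_days=14,
                        overdue_submitted_weighting=0.5,
                        overdue_not_submitted_weighting=1.0):
        """tasks: assignments/quizzes/lessons whose due date has passed, each a dict
        with 'days_late', 'value' and 'submitted' keys. Returns an accumulated risk score."""
        risk = 0.0
        for task in tasks:
            days_late_weighting = ((task["days_late"] - overdue_grace_days)
                                   / (overdue_maximum_days - overdue_grace_days))
            # Clamp the lateness weighting to the range 0..1.
            days_late_weighting = max(0.0, min(1.0, days_late_weighting))
            assessment_value_weighting = task["value"] / total_assessment_value
            if task["submitted"]:
                # Submitted, but after the due date.
                risk += (days_late_weighting * assessment_value_weighting
                         * overdue_submitted_weighting)
            else:
                # Past due and still not submitted.
                risk += (days_late_weighting * assessment_value_weighting
                         * overdue_not_submitted_weighting)
        return risk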

  20. Activity: specify a new ‘indicator’ of learning in your context • What it does in one sentence • Procedure to give a number between 0% (no risk) and 100% (high risk) • Words? • Pictures? • Flowchart? • Algorithm? • What variables could we tweak? • How important is this indicator?
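
As one hypothetical answer to this activity (not an existing plugin): a forum-participation indicator. In one sentence: a student who has posted to the discussion forums far less than expected this far into the semester is at higher risk. A sketch, with the tweakable variables exposed as parameters:

    # Hypothetical example indicator; expected_posts_per_week is the main variable to tweak.
    def forum_participation_risk(posts_so_far, week_of_semester,
                                 expected_posts_per_week=1.0):
        """Return a risk score between 0 (no risk) and 100 (high risk)."""
        expected_posts = expected_posts_per_week * week_of_semester
        if expected_posts <= 0:
            return 0.0  # Too early in the semester to judge.
        shortfall = max(expected_posts - posts_so_far, 0.0)
        return min(shortfall / expected_posts, 1.0) * 100

    # By week 6 a student has made 2 posts against an expectation of 6,
    # so the indicator reports roughly 67% risk.
    print(forum_participation_risk(posts_so_far=2, week_of_semester=6))

How important this indicator should be depends on whether forum participation actually matters for learning in the unit, which is exactly the weighting decision the tools leave to the teacher.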

  21. “From our students’ point of view, the assessment always defines the actual curriculum” Ramsden 1992, p. 187

  22. “Students can, with difficulty, escape from the effects of poor teaching…” Boud 1995, p. 35

  23. “…they cannot (by definition if they want to graduate) escape the effects of poor assessment” Boud 1995, p. 35

  24. LA are only as good as the curriculum • Training statistical LA on assessment or retention outcomes ≠ learning • The teacher needs to be in control of LA use and specify learning • LA makes it more difficult to “escape the effects of bad teaching”

  25. “whether through denial, pride, or ignorance, students who need help the most are least likely to request it” Martin & Arendale 1993 p. 2

  26. It’s the end of week 2 and student X hasn’t ever logged in. What do we do?

  27. How can we make follow-up effective? • Personal or robotic? • Paint a grim picture? • Refer on or see personally? • Specific guidance

  28. It’s the week of the census and modeling suggests student X is 70% likely to fail. What do we do?

  29. How can we make follow-up ethical? • Do students have a right to try (and fail)? To give up? To be strategic? • Will draconian measures lead to LMS-farming? • Student-identified triggers

  30. It’s week 10 and student X already has 60% of the course grade but hasn’t logged in for two weeks. What do we do?

  31. Discussion and close:the near-future for open-source learning analytics

  32. Extra time options • Live demonstration of Engagement Analytics tool • Further specifying an indicator into pseudocode for a developer • Develop strategies for following up students at risk • Discuss student views of analytics tools • Discuss open-source

  33. References and sources • SoLAR definition of LA • Memes are from Quickmeme • NetSpot Innovation Fund logo courtesy of NetSpot. (You should apply for an open-source development grant through them.) • Biggs, J. (1999). What the Student Does: teaching for enhanced learning. Higher Education Research & Development, 18(1), 57-75. doi: 10.1080/0729436990180105 • Boud, D. (1995). Assessment and learning: contradictory or complementary? In P. Knight (Ed.), Assessment for Learning in Higher Education (pp. 35-48). London: Kogan Page. • Ramsden, P. (1992). Learning to teach in higher education. London: Routledge. • Martin, D., & Arendale, D. (1993). Supplemental Instruction: Improving first-year student success in high-risk courses. The Freshman Year Experience: Monograph Series (2nd ed., Vol. 7). Columbia, SC: National Resource Center for the First Year Experience and Students in Transition, University of South Carolina.
