
Building A Predictive Model: A Behind the Scenes Look





Presentation Transcript


  1. Building A Predictive Model: A Behind the Scenes Look Mike Sharkey, Director of Academic Analytics, The Apollo Group. January 9, 2012

  2. The 50,000 ft. View We have lots of data; we need to set a good foundation… …so we can extract information that will help our students succeed

  3. Our Data Foundation

  4. Integrated Data Warehouse [architecture diagram] Source systems (applicant databases, SIS, LMS, CMS, other applications) feed an Integrated Data Repository, which in turn serves reporting tools, analytics tools, and business intelligence.

  5. How is it working? Advantages: • Continuous flow of integrated data • Can drill down to the transaction level. Disadvantages: • New data flows require in-demand resources • Need skilled staff to understand the data model

  6. Building a Predictive Model

  7. Predicting success… but what is success? • Learning: did the students learn what they were supposed to learn? • Program persistence: the student doesn't drop out • Course completion: the student passes the class

  8. The Plan • Use available data to build a model (logistic regression) • Demographics, schedule, course history, assignments • Develop a model to predict course pass/fail on, e.g., a scale of 1-10 • A 10 will most likely pass the course • A 1 will most likely fail the course • Feed the score to academic counselors, who can intervene (phone at-risk students)
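A minimal sketch of the kind of model the plan describes, assuming scikit-learn and synthetic data; the feature names and the mapping from pass probability to the 1-10 scale are illustrative assumptions, not the production implementation:

```python
# Hedged sketch: logistic regression predicting course pass/fail, with the
# predicted pass probability mapped onto a 1-10 counselor-facing score.
# Features and data are synthetic stand-ins, not the real model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical columns: avg assignment score, credits earned,
# past course failures, financial status flag.
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] - 0.7 * X[:, 2] + rng.normal(size=500) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)

def risk_score(features):
    """Map P(pass) to 1-10: 10 = will likely pass, 1 = will most likely fail."""
    p_pass = model.predict_proba(np.atleast_2d(features))[0, 1]
    return max(1, int(np.ceil(p_pass * 10)))

print(risk_score(X[0]))  # counselors would phone students with low scores
```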

  9. The Model • Built different models • Associates, Bachelors, Masters • Predict at Week 0, Week 1, … through the final week • Strongest predictive coefficients • Course assignment scores (stronger as the course goes on) • Financial status (mostly at Week 0) • Past course failures • Credits earned in the program (tenure)
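To illustrate the week-by-week idea, a sketch (again on simulated data) that refits the model at several weeks and ranks standardized coefficients; the pattern of assignment scores strengthening over time mirrors the slide's claim, but all numbers here are invented:

```python
# Sketch: one model per week, with assignment signal accumulating as the
# course runs. Standardized coefficients show which predictors dominate.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
names = ["assignment_avg", "financial_status", "past_failures", "credits_earned"]
base = rng.normal(size=(1000, 4))
y = (base[:, 0] + 0.4 * base[:, 1] - 0.6 * base[:, 2] + rng.normal(size=1000) > 0).astype(int)

for week in (0, 3, 6):
    X = base.copy()
    X[:, 0] *= week / 6.0  # no assignment signal at Week 0; grows thereafter
    Xs = StandardScaler().fit_transform(X)
    coefs = LogisticRegression().fit(Xs, y).coef_[0]
    ranked = sorted(zip(names, coefs), key=lambda t: -abs(t[1]))
    print(f"week {week}:", [(n, round(c, 2)) for n, c in ranked])
```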

  10. Where we are today • Validation • The statistics are sound, but we need to field test the intervention plan to validate the model scores • What we learned • The strongest parameters are the most obvious (assignments) • Weak parameters: gender, age, weekly attendance • Add future parameters as available • Class activity, participation, faculty alerts, inactive time between courses, interaction with faculty, orientation participation, late assignments

  11. Thank YOU! Mike Sharkey mike.sharkey@phoenix.edu 602-557-3532

  12. 5 Challenges in Building & Deploying Learning Analytics Solutions Christopher Brooks (cab938@mail.usask.ca)

  13. My biases • The domain of higher education • Scalable and broad solutions • The grey areas between research and production

  14. Question: Your biases: what do you think the principal goal of Learning Analytics should be? • Enabling human intervention • Computer assisted instruction (dynamic content recommendation, tutoring, quizzing) • Conducting educational research • Administrative intelligence, transparency, competitiveness • Other (write in chat)

  15. Challenge 1: What are you building? • Exploring data • Intuition and domain expertise are useful • Multiple perspectives from people familiar with the data • More data types (diversity) is better; smaller datasets (fewer instances) are OK • Imprecision in data is OK • Visualization techniques • Answering a question • Data should be cleaned and rigorous, with error recognized explicitly • The quantity of data (instances) strengthens the result • Decision makers must guide the process (are the questions worth answering?) • Statistical techniques

  16. Case 1: How healthy is your classroom community? (social network analysis)
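A small illustrative sketch of what community-health metrics might look like with networkx; the reply data and the chosen indicators are assumptions for illustration, not the measures used in the case study:

```python
# Illustrative SNA sketch: build a who-replied-to-whom graph from forum
# data and compute a few simple community-health indicators.
import networkx as nx

replies = [("ana", "ben"), ("ben", "ana"), ("cam", "ana"),
           ("dee", "ben"), ("ana", "dee"), ("fay", "cam")]  # made-up forum replies
G = nx.DiGraph(replies)

print("density:", round(nx.density(G), 2))                  # overall connectedness
print("peripheral:", [n for n, d in G.degree() if d <= 1])  # weakly connected students
print("hubs:", sorted(G.degree, key=lambda t: -t[1])[:2])   # who anchors discussion
```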

  17. Case 2: Applying unsupervised learning techniques (clustering)

  18. Results validated, quantified, and encouraged more investigation • Hypotheses • H1: There will be a group of minimal activity learners... • H2: There will be a group of high activity learners... • H3: There will be a group of disillusioned learners... • H4: There will be a group of deferred learners...
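As a sketch of how clustering could surface groups like H1-H4, here is k-means over invented activity features; the feature columns, group centers, and cluster count are all assumptions for illustration:

```python
# Unsupervised sketch: k-means over simple activity features, then inspect
# each cluster's profile to see whether hypothesized groups actually appear.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Invented columns: logins/week, lecture views, forum posts.
low  = rng.normal([1, 2, 0],  0.5, size=(40, 3))   # minimal activity (H1-like)
high = rng.normal([9, 25, 6], 1.0, size=(40, 3))   # high activity (H2-like)
late = rng.normal([2, 20, 1], 0.8, size=(40, 3))   # deferred viewers (H4-like)
X = np.vstack([low, high, late])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))
for k in range(3):
    print(f"cluster {k}: n={np.sum(labels == k)}, "
          f"mean={X[labels == k].mean(axis=0).round(1)}")
```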

  19. Challenge 2: What to collect • Too much versus too little • Make a choice based on end goals • Think in terms of events instead of the “click stream” • Collecting “everything” comes with upfront development costs and analysis costs • The risk is the project never gets off the ground • Make hypotheses explicit in your team so they can decide how best to collect that data • Follow agile software development techniques (iterate & get constant feedback) • Build institutional will with small targeted gains
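One way to make "events instead of the click stream" concrete: a deliberately small event record designed around the questions the team has agreed to answer. The schema below is a hypothetical example, not a standard:

```python
# Sketch of an explicit event schema: collect a few meaningful fields per
# pedagogical event rather than logging every raw click.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class LearningEvent:
    actor: str    # student id
    verb: str     # "submitted", "viewed", "posted", ...
    obj: str      # "assignment:3", "lecture:7", ...
    course: str
    at: str       # ISO-8601 timestamp

def emit(actor, verb, obj, course):
    evt = LearningEvent(actor, verb, obj, course,
                        datetime.now(timezone.utc).isoformat())
    print(asdict(evt))  # in practice, append to a durable log or queue

emit("s123", "submitted", "assignment:3", "CS101")
```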

  20. Challenge 3: Understand your user (breadth of context) • Administrator: rates for degree completion, retention, re-enrolment, number of active students... (abbreviated statistics) • Instructional designer/researcher: what works and what doesn't, how tools and processes should change... (sophisticated statistics & visualizations) • Instructor: evaluation of students and of a cohort, identifying immediate remediation... (visualization, abbreviated statistics) • Student: evaluation, evaluation, evaluation... (visualization)

  21. With great power comes great responsibility.... • Some potential abuses of student tracking data • Changing pedagogical technique to the detriment of some students • Denying help to those who “aren't really trying” • A failure of instructors to acknowledge the challenges that face students Is it ethical to give instructors access to student analytics data? • Yes • No • Sometimes (write your thoughts in the chat)

  22. Challenge 4: Acknowledge Caveats • Analytics shows you only part of the picture • Dead-tree learning, in-person social constructivism, shoulder surfing/account sharing • Anonymization tools, JavaScript/Flash blockers • False positives (incorrect Amazon recommendations) • Misleading actions (incorrect self-assessment, or gaming the system (Baker)) • Solutions • Aggregation & anonymization • Make error values explicit • Use broad categories for actionable analytics
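A sketch of the first two "solutions" bullets: suppress statistics for groups too small to anonymize, and report spread alongside the point estimate. The k=5 threshold is an assumed policy choice:

```python
# Aggregation with explicit error values and small-group suppression.
import statistics

K_MIN = 5  # assumed suppression threshold; set per institutional policy

def safe_mean(scores):
    if len(scores) < K_MIN:
        return None  # suppressed: too few students to report safely
    return statistics.mean(scores), statistics.stdev(scores)  # value + spread

print(safe_mean([72, 85, 90, 64, 78, 88]))  # (mean, stdev)
print(safe_mean([72, 85]))                  # None -> suppressed
```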

  23. Does learner modelling offer solutions? • The learner modelling community blends with analytics • Open learner modelling (students can see their completed model) • Scrutable learner modelling (students can see how the system's model of them is formed) Question: I believe the student should have the right to view where analytics data about themselves has come from and who it has been made available to. • Yes • No • Sometimes (and what are the implications of doing this? write in chat)

  24. Challenge 5: Cross Application Boundaries • Data from different applications (clickers, LCMS, lecture capture, SIS/CIS, publisher quizzes, etc.) don't play well together • Requires cleaning • Requires normalizing on semantics • Requires access • Data warehousing activities • Is there a light on the horizon? http://www.flickr.com/photos/malikdhadha/5105818154/
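A toy sketch of the "normalizing on semantics" step: adapter functions that map records from two hypothetical systems onto one common event shape so they can be queried together:

```python
# Normalization sketch: per-source adapters into one shared event schema.
# All source field names here are hypothetical.
def from_clicker(rec):
    return {"actor": rec["student"], "verb": "answered",
            "obj": f"question:{rec['q_id']}", "source": "clicker"}

def from_lms(rec):
    return {"actor": rec["user_id"], "verb": rec["action"],
            "obj": rec["resource"], "source": "lms"}

raw = [({"student": "s1", "q_id": 9}, from_clicker),
       ({"user_id": "s1", "action": "viewed", "resource": "page:intro"}, from_lms)]

events = [convert(rec) for rec, convert in raw]
print(events)  # one schema, queryable across application boundaries
```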

  25. Quick conclusions • Thus far I've learned it's important to: • Know your goals • Know your user • Capture what you know you need and don't worry about the rest • Acknowledge limitations of your approach • Iterate, iterate, iterate. Christopher Brooks, Department of Computer Science, University of Saskatchewan, cab938@mail.usask.ca

  26. Learning Analytics for C21 Dispositions & Skills Simon Buckingham Shum Knowledge Media Institute, Open U. UK simon.buckinghamshum.net @sbskmi

  27.–34. L.A. framework to think with… [framework diagram, built up incrementally across eight slides] The focus of most LA effort is beginning to move towards these more complex spaces (http://solaresearch.org/OpenLearningAnalytics.pdf), which are critical for learner engagement and authentic learning.

  35. Learning analytics for this? “We are preparing students for jobs that do not exist yet, that will use technologies that have not been invented yet, in order to solve problems that are not even problems yet.” “Shift Happens” http://shifthappens.wikispaces.com

  36. Learning analytics for this? “The test of successful education is not the amount of knowledge that pupils take away from school, but their appetite to know and their capacity to learn.” Sir Richard Livingstone, 1941

  37. analytics for… C21 skills? learning how to learn? authentic enquiry? social capital, critical questioning, argumentation, citizenship, habits of mind, resilience, collaboration, creativity, metacognition, identity, readiness, sensemaking, engagement, motivation, emotional intelligence

  38. L.A. framework to think with… More LA effort needed on, e.g.: 1. Disposition Analytics; 2. Discourse Analytics.

  39. Analytics for learning dispositions

  40. ELLI: Effective Lifelong Learning Inventory. A web questionnaire of 72 items (children's and adult versions; used in schools, universities and workplaces). Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning Analytics. Accepted to the 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr – 2 May, 2012).

  41. Validated as loading onto 7 dimensions of “Learning Power”, each paired with its opposite pole: Changing & Learning vs. Being Stuck & Static; Meaning Making vs. Data Accumulation; Critical Curiosity vs. Passivity; Creativity vs. Being Rule Bound; Learning Relationships vs. Isolation & Dependence; Strategic Awareness vs. Being Robotic; Resilience vs. Fragility & Dependence.

  42. ELLI generates a 7-dimensional spider diagram of how the learner sees themselves: the basis for a mentored discussion on the learner's self-perception and on strategies for strengthening the profile. Bristol and the Open University are now embedding ELLI in learning software.
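For readers wanting to picture the diagram, a minimal matplotlib sketch of a 7-dimension spider chart; the scores are made up, and this is not the ELLI software:

```python
# Illustrative spider (radar) chart over the seven ELLI dimensions.
import numpy as np
import matplotlib.pyplot as plt

dims = ["Changing & Learning", "Meaning Making", "Critical Curiosity",
        "Creativity", "Learning Relationships", "Strategic Awareness",
        "Resilience"]
scores = [0.7, 0.5, 0.8, 0.4, 0.6, 0.5, 0.65]  # hypothetical 0-1 profile

angles = np.linspace(0, 2 * np.pi, len(dims), endpoint=False).tolist()
values = scores + scores[:1]   # repeat first point to close the polygon
angles = angles + angles[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dims, fontsize=8)
plt.show()
```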

  43. Adding imagery to ELLI dimensions to connect with learner identity

  44. ELLI generates cohort data for each dimension

  45. …drilling down on a specific dimension

  46. EnquiryBlogger: Tuning WordPress as an ELLI-based learning journal • Standard WordPress editor • Categories from ELLI • A plugin visualizes blog categories, mirroring the ELLI spider

  47. EnquiryBlogger: Cohort Dashboard

  48. LearningEmergence.net: more on analytics for learning to learn and authentic enquiry

  49. Analytics for learning conversations
