Presentation Transcript

Building A Predictive Model: A Behind the Scenes Look

Mike Sharkey

Director of Academic Analytics, The Apollo Group

January 9, 2012

The 50,000 ft. View

We have lots of data; we need to set a good foundation… so we can extract information that will help our students succeed.

Integrated Data Warehouse

[Architecture diagram: applicant databases, SIS, LMS, and CMS applications feed an Integrated Data Repository, which in turn feeds reporting tools, analytics tools, and business intelligence.]

HOW IS IT WORKING?
  • Advantages
    • Continuous flow of integrated data
    • Can drill down to the transaction level
  • Disadvantages
    • New data flows require in-demand resources
    • Need skilled staff to understand the data model

PREDICTING SUCCESS… BUT WHAT IS SUCCESS?
  • Course completion: the student passes the class
  • Program persistence: the student does not drop out
  • Learning: did the students learn what they were supposed to learn?

The Plan
  • Use available data to build a model (logistic regression)
    • Demographics, schedule, course history, assignments
  • Develop a model to predict course pass/fail
    • e.g. a scale of 1-10
      • 10 = likely to pass the course
      • 1 = most likely to fail the course
  • Feed the score to academic counselors who can intervene (phone at-risk students)
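
The slides don't show the implementation, so the following is a minimal sketch of the kind of model the plan describes, assuming scikit-learn and hypothetical column names for the warehouse extract:

```python
# Sketch: logistic regression over demographic, schedule, course-history and
# assignment features, with the predicted pass probability rescaled to the
# 1-10 score fed to academic counselors. Column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

FEATURES = ["age", "credits_earned", "prior_failures",        # demographics / course history
            "financial_status_ok", "week1_assignment_pct"]    # finance / assignments

df = pd.read_csv("course_outcomes.csv")            # hypothetical extract from the data warehouse
X, y = df[FEATURES], df["passed_course"]           # y: 1 = passed, 0 = failed

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Convert the pass probability into the 1-10 scale: 10 = likely pass, 1 = likely fail.
prob_pass = model.predict_proba(X_test)[:, 1]
risk_score = np.clip(np.ceil(prob_pass * 10), 1, 10).astype(int)
```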
The MODEL
  • Built different models
    • Associates, Bachelors, Masters
    • Predict at Week 0, Week 1, … to Week (last)
  • Strongest predictive coefficients
    • Course assignment scores (stronger as course goes on)
    • Financial status (mostly at Week 0)
    • Did the student fail courses in the past
    • Credits earned in the program (tenure)
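
To make the "one model per week" idea concrete, here is a hedged sketch that fits a separate regression per week and ranks coefficients by magnitude; the per-week feature lists are illustrative, not the actual Apollo Group model:

```python
# Fit one logistic regression per week using only the features available by
# that week, then list the strongest standardized coefficients.
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FEATURES_BY_WEEK = {                       # illustrative feature availability
    0: ["financial_status_ok", "prior_failures", "credits_earned"],
    1: ["financial_status_ok", "prior_failures", "credits_earned", "week1_assignment_pct"],
    # ... one entry per week through the final week of the course
}

for week, feats in FEATURES_BY_WEEK.items():
    pipe = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    pipe.fit(df[feats], df["passed_course"])                  # df from the sketch above
    coefs = pipe.named_steps["logisticregression"].coef_[0]
    ranked = sorted(zip(feats, coefs), key=lambda fc: abs(fc[1]), reverse=True)
    print(f"Week {week} strongest predictors:", ranked[:3])
```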
Where we are today
  • Validation
    • The statistics are sound, but we need to field test the intervention plan to validate the model scores
  • What we learned
    • The strongest parameters are the most obvious (assignments)
    • Weak parameters: gender, age, weekly attendance
  • Add future parameters as available
    • Class activity, participation, faculty alerts, inactive time between courses, interaction with faculty, orientation participation, late assignments

Thank YOU!

Mike Sharkey

mike.sharkey@phoenix.edu

602-557-3532


5 Challenges in Building & Deploying Learning Analytics Solutions

Christopher Brooks (cab938@mail.usask.ca)

My biases
  • A domain of higher education
  • Scalable and broad solutions
  • The grey areas between research and production
Question (your biases): What do you think the principal goal of Learning Analytics should be?
  • Enabling human intervention
  • Computer assisted instruction (dynamic content recommendation, tutoring, quizzing)
  • Conducting educational research
  • Administrative intelligence, transparency, competitiveness
  • Other (write in chat)
Challenge 1: What are you building?
  • Exploring data
    • Intuition and domain expertise are useful
    • Multiple perspectives from people familiar with the data
    • More data types (diversity) are better; smaller datasets (fewer instances) are OK
    • Imprecision in the data is OK
    • Visualization techniques
  • Answering a question
    • Data should be cleaned and rigorous, with error recognized explicitly
    • The quantity of data in the datasets (instances) strengthens the result
    • Decision makers must guide the process (are the questions worth answering?)
    • Statistical techniques
Results validated, quantified, and encouraged more investigation
  • Hypotheses
    • H1: There will be a group of minimal activity learners...
    • H2: There will be a group of high activity learners...
    • H3: There will be a group of disillusioned learners...
    • H4: There will be a group of deferred learners...
Challenge 2: What to collect
  • Too much versus too little
    • Make a choice based on end goals
    • Think in terms of events instead of the “click stream” (see the sketch after this list)
    • Collecting “everything” comes with upfront development costs and analysis costs
      • The risk is the project never gets off the ground
      • Make hypotheses explicit in your team so they can decide how best to collect that data
  • Follow agile software development techniques (iterate & get constant feedback)
  • Build institutional will with small targeted gains
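
One way of reading "events instead of the click stream" is to define a small, explicit event vocabulary tied to your hypotheses before collecting anything. A minimal sketch, with invented event names and fields:

```python
# Explicit event types chosen to answer specific questions, rather than
# recording every raw click. Names and fields are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

EVENT_TYPES = {            # the vocabulary your hypotheses actually need
    "video_completed",
    "quiz_submitted",
    "forum_post_created",
    "assignment_submitted",
}

@dataclass
class LearningEvent:
    event_type: str
    student_id: str
    course_id: str
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    detail: dict = field(default_factory=dict)

    def __post_init__(self):
        if self.event_type not in EVENT_TYPES:
            raise ValueError(f"Not an agreed event type: {self.event_type}")

evt = LearningEvent("quiz_submitted", "s123", "STAT101", detail={"score": 0.8})
```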
Challenge 3: Understand your user

Breadth of context varies across four audiences:

  • Administrator: rates for degree completion, retention rate, re-enrolment rate, number of active students... (abbreviated statistics)
  • Instructional Designer/Researcher: educational research into what works and what doesn't, and how tools and processes should change... (sophisticated statistics & visualizations)
  • Instructor: evaluation of students and of a cohort of students, and identifying immediate remediation... (visualization, abbreviated statistics)
  • Student: evaluation, evaluation, evaluation... (visualization)

With great power comes great responsibility....
  • Some potential abuses of student tracking data
    • Changing pedagogical technique to the detriment of some students
    • Denying help to those who “aren't really trying”
    • A failure of instructors to acknowledge the challenges that face students

Is it ethical to give instructors access to student analytics data?

      • Yes
      • No
      • Sometimes

(write your thoughts in the chat)

Challenge 4: Acknowledge Caveats
  • Analytics shows only part of the picture
    • Dead-tree learning, in-person social constructivism, shoulder surfing/account sharing
    • Anonymization tools, JavaScript/Flash blockers
    • False positives (incorrect Amazon recommendations)
    • Misleading actions (incorrect self-assessment, or gaming the system (Baker))
  • Solutions
    • Aggregation & anonymization
    • Make error values explicit
    • Use broad categories for actionable analytics
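
A small sketch of what "aggregation plus broad categories" could look like in practice; the bucket boundaries and the minimum reportable group size are assumptions, not from the talk:

```python
# Aggregate per-student risk scores into broad categories and suppress any
# group too small to report without risking re-identification.
import pandas as pd

def category_counts(scores: pd.Series, min_group_size: int = 5) -> pd.Series:
    """Count students per broad category; suppress counts below the threshold."""
    buckets = pd.cut(scores, bins=[0, 4, 7, 10], labels=["at risk", "borderline", "on track"])
    counts = buckets.value_counts()
    return counts.where(counts >= min_group_size, other=f"suppressed (<{min_group_size})")

scores = pd.Series([2, 3, 9, 8, 8, 7, 10, 9, 1, 6])   # invented 1-10 risk scores
print(category_counts(scores))
```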
Does learner modelling offer solutions?
  • The learner modelling community blends with analytics.
    • Open learner modelling (students can see their completed model)
    • Scrutable learner modelling (students can see how the system's model of them is formed)

Question: I believe the student should have the right to view where analytics data about themselves has come from and who it has been made available to.

      • Yes
      • No
      • Sometimes

(and what are the implications on doing this? write in chat)

Challenge 5: Cross Application Boundaries
  • Data from different applications (clickers, LCMS, lecture capture, SIS/CIS, publisher quizzes, etc.) doesn't play well together
    • Requires cleaning
    • Requires normalizing on semantics (see the sketch below)
    • Requires access
  • Data warehousing activities
  • Is there a light on the horizon?
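
As an illustration of "normalizing on semantics", a hedged sketch that maps records from two hypothetical sources (an LMS gradebook export and a clicker response log) onto one shared event shape; all field names are invented:

```python
# Map heterogeneous source records onto a common event schema so that
# downstream analytics can treat them uniformly. Field names are
# hypothetical placeholders for whatever the real exports contain.
from datetime import datetime

def from_lms_row(row: dict) -> dict:
    return {
        "event_type": "assignment_submitted",
        "student_id": str(row["user_pk"]),        # LMS-specific identifier
        "course_id": row["course_code"],
        "occurred_at": datetime.fromisoformat(row["submitted_at"]),
        "detail": {"grade": row.get("grade")},
    }

def from_clicker_row(row: dict) -> dict:
    return {
        "event_type": "quiz_submitted",
        "student_id": str(row["device_owner"]),   # clicker registration maps device to student
        "course_id": row["session_course"],
        "occurred_at": datetime.fromisoformat(row["responded_at"]),
        "detail": {"correct": row["response"] == row["answer_key"]},
    }

events = [
    from_lms_row({"user_pk": 42, "course_code": "STAT101",
                  "submitted_at": "2012-01-09T10:00:00", "grade": 0.8}),
    from_clicker_row({"device_owner": 42, "session_course": "STAT101",
                      "responded_at": "2012-01-09T10:05:00", "response": "B", "answer_key": "B"}),
]
```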

http://www.flickr.com/photos/malikdhadha/5105818154/

Christopher Brooks

Department of Computer Science

University of Saskatchewan

cab938@mail.usask.ca

Quick conclusions
  • Thus far I've learned it's important to:
    • Know your goals
    • Know your user
    • Capture what you know you need and don't worry about the rest
    • Acknowledge limitations of your approach
    • Iterate, iterate, iterate

Learning Analytics for C21 Dispositions & Skills

Simon Buckingham Shum

Knowledge Media Institute, Open U. UK

simon.buckinghamshum.net

@sbskmi

L.A. framework to think with…

[Framework diagram (http://solaresearch.org/OpenLearningAnalytics.pdf), built up over three slides: the focus of most LA effort is beginning to move towards these more complex spaces, which are critical for learner engagement and authentic learning.]

Learning analytics for this?

“We are preparing students for jobs that do not exist yet, that will use technologies that have not been invented yet, in order to solve problems that are not even problems yet.”

“Shift Happens” http://shifthappens.wikispaces.com

Learning analytics for this?

“The test of successful education is not the amount of knowledge that pupils take away from school, but their appetite to know and their capacity to learn.”

Sir Richard Livingstone, 1941

Analytics for… C21 skills? Learning how to learn? Authentic enquiry?

social capital, critical questioning, argumentation, citizenship, habits of mind, resilience, collaboration, creativity, metacognition, identity, readiness, sensemaking, engagement, motivation, emotional intelligence

L.A. framework to think with…

[Framework diagram revisited: more LA effort is needed beyond the focus of most current LA work, moving towards these more complex spaces, e.g. 1. Disposition Analytics and 2. Discourse Analytics.]


ELLI: Effective Lifelong Learning Inventory. Web questionnaire, 72 items (children and adult versions: used in schools, universities and the workplace)

Buckingham Shum, S. and Deakin Crick, R (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning Analytics. Accepted to 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr – 2 May, 2012).

Validated as loading onto 7 dimensions of “Learning Power”

  • Changing & Learning vs. Being Stuck & Static
  • Meaning Making vs. Data Accumulation
  • Critical Curiosity vs. Passivity
  • Creativity vs. Being Rule Bound
  • Learning Relationships vs. Isolation & Dependence
  • Strategic Awareness vs. Being Robotic
  • Resilience vs. Fragility & Dependence
ELLI generates a 7-dimensional spider diagram of how the learner sees themself

Basis for a mentored discussion on how the learner sees him/herself, and strategies for strengthening the profile

Bristol and Open University are now embedding ELLI in learning software.
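
A minimal sketch of rendering a seven-dimension ELLI profile as a spider (radar) diagram, assuming matplotlib and invented scores:

```python
# Radar ("spider") chart of the seven ELLI learning-power dimensions.
# The scores are invented for illustration; ELLI itself derives them from the 72 items.
import numpy as np
import matplotlib.pyplot as plt

dimensions = ["Changing & Learning", "Meaning Making", "Critical Curiosity",
              "Creativity", "Learning Relationships", "Strategic Awareness", "Resilience"]
scores = [0.7, 0.6, 0.8, 0.5, 0.65, 0.4, 0.55]        # hypothetical profile, 0-1 scale

angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
angles += angles[:1]                                   # close the polygon
values = scores + scores[:1]

ax = plt.subplot(polar=True)
ax.plot(angles, values)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions, fontsize=8)
ax.set_ylim(0, 1)
plt.show()
```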


EnquiryBlogger: Tuning WordPress as an ELLI-based learning journal

Standard Wordpress editor

Categories from ELLI

Plugin visualizes blog categories, mirroring the ELLI spider

LearningEmergence.net

More on analytics for learning to learn and authentic enquiry

Discourse Learning Analytics

Effective learning conversations display some typical characteristics which learners can and should be helped to master

Learners’ written, online conversations can be analysed computationally for patterns signifying weaker and stronger forms of contribution

Socio-cultural discourse analysis (Mercer et al, OU)
  • Disputational talk, characterised by disagreement and individualised decision making.
  • Cumulative talk, in which speakers build positively but uncritically on what the others have said.
  • Exploratory talk, in which partners engage critically but constructively with each other's ideas.

Mercer, N. (2004). Sociocultural discourse analysis: analysing classroom talk as a social mode of thinking. Journal of Applied Linguistics, 1(2), 137-168.

Socio-cultural discourse analysis (Mercer et al, OU)
  • Exploratory talk, in which partners engage critically but constructively with each other's ideas.
    • Statements and suggestions are offered for joint consideration.
    • These may be challenged and counter-challenged, but challenges are justified and alternative hypotheses are offered.
    • Partners all actively participate and opinions are sought and considered before decisions are jointly made.
    • Compared with the other two types, in Exploratory talk knowledge is made more publicly accountable and reasoning is more visible in the talk.

Mercer, N. (2004). Sociocultural discourse analysis: analysing classroom talk as a social mode of thinking. Journal of Applied Linguistics, 1(2), 137-168.

Analytics for identifying Exploratory talk

Elluminate sessions can be very long – lasting for hours or even covering days of a conference

It would be useful if we could identify where quality learning conversations seem to be taking place, so we can recommend those sessions, and not have to sit through online chat about virtual biscuits

Ferguson, R. and Buckingham Shum, S. Learning analytics to identify exploratory dialogue within synchronous text chat. 1st International Conference on Learning Analytics & Knowledge (Banff, Canada, 27 Mar-1 Apr, 2011)
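
In the spirit of that paper, a hedged sketch of one simple approach: score sliding windows of chat by counting cue phrases associated with exploratory dialogue. The phrase list and threshold are illustrative assumptions, not the published classifier:

```python
# Flag chat windows whose density of "exploratory" cue phrases crosses a
# threshold, so long sessions can be skimmed for likely learning conversations.
EXPLORATORY_CUES = ["because", "i think", "what if", "for example",
                    "do you mean", "on the other hand", "maybe we"]

def exploratory_windows(messages, window=10, min_hits=3):
    """Yield (start_index, hit_count) for message windows that look exploratory."""
    lowered = [m.lower() for m in messages]
    for start in range(max(len(lowered) - window + 1, 1)):
        chunk = " ".join(lowered[start:start + window])
        hits = sum(chunk.count(cue) for cue in EXPLORATORY_CUES)
        if hits >= min_hits:
            yield start, hits

chat = ["I think it fails because the sample is small",
        "what if we reran it with the full cohort?",
        "maybe we should compare cohorts first",
        "lol virtual biscuits"]
print(list(exploratory_windows(chat, window=4, min_hits=2)))
```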

KMi’s Cohere: a web deliberation platform enabling semantic social network and discourse network analytics

Rebecca is playing the role of broker, connecting 2 peers’ contributions in meaningful ways

De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M. and Cannavacciuolo, L. Discourse-centric learning analytics. 1st International Conference on Learning Analytics & Knowledge (Banff, 27 Mar-1 Apr, 2011)

Discourse analysis

Xerox’s parser can detect the presence of ‘knowledge-level’ moves in text:

NOVELTY: "... new insights provide direct evidence ...", "... we suggest a new ... approach ...", "... results define a novel role ..."

OPEN QUESTION: "… little is known …", "… role … has been elusive", "Current data is insufficient …"

CONTRASTING IDEAS: "… unorthodox view resolves … paradoxes …", "In contrast with previous hypotheses ...", "... inconsistent with past findings ..."

SIGNIFICANCE: "studies ... have provided important advances", "Knowledge ... is crucial for ... understanding", "valuable information ... from studies"

SUMMARIZING: "The goal of this study ...", "Here, we show ...", "Altogether, our results ... indicate"

GENERALIZING: "... emerging as a promising approach", "Our understanding ... has grown exponentially ...", "... growing recognition of the importance ..."

SURPRISE: "We have recently observed ... surprisingly", "We have identified ... unusual", "The recent discovery ... suggests intriguing roles"

BACKGROUND KNOWLEDGE: "Recent studies indicate …", "… the previously proposed …", "… is universally accepted ..."

Ágnes Sándor & OLnet Project: http://olnet.org/node/512

De Liddo, A., Sándor, Á. and Buckingham Shum, S. (In Press). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work Journal

Next steps

SOCIAL LEARNING ANALYTICS: Develop this framework to integrate social, discourse, disposition and other process-centric analytics

DISPOSITION ANALYTICS: Extend the capabilities of the ELLI ‘learning power’ platform using real-time analytics data from online learner activity

DISCOURSE ANALYTICS: human+machine annotation of written discourse and argument maps

In more detail…

Social Learning Analytics

  • Buckingham Shum, S. and Ferguson, R. (2011). Social Learning Analytics. Available as: Technical Report KMI-11-01, Knowledge Media Institute, The Open University, UK. http://kmi.open.ac.uk/publications/techreport/kmi-11-01

Discourse Analytics

  • De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M. and Cannavacciuolo, L. (2011). Discourse-Centric Learning Analytics. 1st International Conference on Learning Analytics & Knowledge (Banff, 27 Mar-1 Apr, 2011). Eprint: http://oro.open.ac.uk/25829
  • Ferguson, R. and Buckingham Shum, S. (2011). Learning Analytics to Identify Exploratory Dialogue Within Synchronous Text Chat. 1st International Conference on Learning Analytics & Knowledge (Banff, Canada, 27 Mar-1 Apr, 2011). Eprint: http://oro.open.ac.uk/28955
  • De Liddo, A., Sándor, Á. and Buckingham Shum, S. (2012, In Press). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work. DOI: 10.1007/s10606-011-9155-x. http://www.springerlink.com/content/23n1408l9g06v062

Disposition Analytics

  • Ferguson, R., Buckingham Shum, S. and Deakin Crick, R. (2011). EnquiryBlogger: Using Widgets to Support Awareness and Reflection in a PLE Setting. 1st Workshop on Awareness and Reflection in Personal Learning Environments, PLE Conference 2011, 11-13 July 2011, Southampton, UK. Eprint: http://oro.open.ac.uk/30598
  • Buckingham Shum, S. and Deakin Crick, R (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling, and Learning Analytics. Accepted to 2nd International Conference on Learning Analytics & Knowledge (Vancouver, 29 Apr – 2 May, 2012). Working draft under revision: http://projects.kmi.open.ac.uk/hyperdiscourse/docs/SBS-RDC-review.pdf
Summary

  • Focus of most LA effort: mastery of core knowledge and skills in training is vital, but no longer sufficient.
  • More LA effort needed: we need analytics tuned to generic capacities which equip learners for novel challenges.