
Laboratory for innovation in digital technologies that close the learning feedback loop using machine learning analytics and cognitive science, maximize learning through inexpensive/open access to content and measurable performance data, and capitalize on a large and growing user base




Presentation Transcript


  1. Laboratory for innovation in digital technologies that • close the learning feedback loop using machine learning analytics and cognitive science • maximize learning through inexpensive/open access to content and measurable performance data • capitalize on a large and growing user base • positively disrupt educational systems PK-20

  2. digital open publishing platform • founded at Rice University in 1999 • 1200 open textbooks/collections • 22,000 educational Lego blocks • 40 languages • >1 million users per month from 190 countries • STEM content used 100 million times since 2007

  3. some Connexions partners • Siyavula: high school science texts for South Africa • Government of Vietnam: developing new curriculum at 40 universities • IEEE: quality review of open materials (IEEEcnx.org) • OpenStax College: developing a library of free college textbooks

  4. USA student debt • over $1,000,000,000,000 • 70% of college students forgo buying texts • 78% of those students believe they will perform worse in the course • 24% of students take fewer credit hours due to expense

  5. open college textbooks • library of 25 free college texts • high quality – editorial board of Nobelists, former directors of the US NSF • turn-key adoptable

  6. open college textbooks • at 10% market share (US), each year will save 1.6M students $160M per year (10x ROI/year) • 80 adoptions, 7000 students, $1M+ in savings already in Fall 2012 • sustaining ecosystem of corporate partners • positive disruption
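The savings figures on this slide are mutually consistent; a quick check of the arithmetic (all numbers taken from the slide):

```python
students = 1_600_000          # students reached at 10% US market share
total_savings = 160_000_000   # claimed savings per year, in dollars

savings_per_student = total_savings / students
print(savings_per_student)    # 100.0 -> implies ~$100 saved per student per year
```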

  7. partners drive scaling and sustainability

  8. sustainability • cost to sustain OSC+PL long-term: $2.2M/year • market size for 25 books: 15.8 million students • assume $4 mission-support fee captured per student for each adopted textbook • market share to guarantee sustainability: 3.5%

  9. ed tech hype “Thanks to the invention of projected images, books will soon be obsolete in schools. Scholars will soon be instructed through the eye.” - Thomas Edison

  10. ed tech potential • data (massive, rich, personal) • close the learning feedback loop

  11. cycles of innovation • curriculum (re)design • personalized learning pathways via machine learning analytics • big data • cognitive science research

  12. personalized learning adaptation • to each learner’s background, context, abilities, goals closed-loop • students and instructors as active explorers of a knowledge space • tools for instructors and students to monitor and track their progress cognitively informed • leverage latest findings from the science of learning

  13. personalized learning

  14. supporting infrastructure for assessment, interactivity, peer review • Quadbase open-source assessments database • interactive sims / Lablets • Focus peer review system • video tutorials / lectures

  15. machine learning algs community • Quadbase open-source assessments database • interactive sims / Lablets • Focus peer review system • video tutorials / lectures

  16. balance technology with cognitive science • cognitive science team: Elizabeth Marsh (Duke), Andrew Butler (Duke), Henry Roediger (WashU)

  17. learning principles retrieval practice • retrieving information from memory is not a neutral event; rather it changes memory • “testing effect” is robust and replicable spacing • distributing practice over time produces better long-term retention than massing practice • “spacing effect” is extremely robust and replicable feedback • closes the learning feedback loop • must be timely [diagram: data – learners – content loop]

  18. learning analytics • Goal: assess and track student learning progress by analyzing their interactions with content • data (massive, rich, personal)

  19. learning analytics Goal: assess and track student learning progress by analyzing their interactions with content Classical approach – “knowledge engineering” • domain experts pore over content, assessments, data, tagging and building rules • fragile, expensive, not scalable, not transferable Modern approach – “machine learning” • learn directly from data • automatic • robust, inexpensive, scalable, transferable

  20. standard practice

  21. from grades to concepts data • graded student responses to unlabeled problems • large matrix (problems × students) with entries: 1 (correct, white), 0 (incorrect, black) standard practice • instructor’s “grade book” = sum/average over each column issues • how to infer concept understanding without problem-level metadata? • what if students solve different sets of problems (sparsity)?

  22. from grades to concepts data • graded student responses to unlabeled problems • large/huge matrix (problems × students) with entries: 1 (correct, white), 0 (incorrect, black) observations • data matrix is sparse (each student only completes a subset of problems) • each problem involves only a small number of “concepts” (low rank)
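A toy version of the grade matrix and the column-average "grade book" (values are made up for illustration; `np.nan` marks unattempted problems, showing the sparsity issue):

```python
import numpy as np

# Toy graded-response matrix: rows = problems, columns = students.
# 1 = correct, 0 = incorrect, np.nan = problem not attempted (sparsity).
Y = np.array([
    [1.0,    0.0,    np.nan],
    [np.nan, 1.0,    1.0   ],
    [0.0,    np.nan, 1.0   ],
    [1.0,    1.0,    1.0   ],
])

# Standard practice: the instructor's "grade book" is the average over
# each column -- it says nothing about which concepts a student knows.
grade_book = np.nanmean(Y, axis=0)
print(grade_book)  # fraction correct per student over attempted problems
```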

  23. SPARFA = SPARse Factor Analysis • binary grade matrix (problems × students) ~ Ber( [problems × concepts] × [concepts × students] + offset )

  24. SPARFA • each problem involves a combination of a small number of key “concepts” (sparse problems × concepts matrix) • each student’s mastery of each “concept” (concepts × students matrix) • each problem’s intrinsic “difficulty” (offset)

  25. SPARFA • estimate of each student’s ability to solve each problem (even unsolved problems) • converted to 0/1 grades by a probit or logistic “coin flip” transformation • red = strong ability, blue = weak ability
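Slides 23–25 together describe a generative model of the form Y ~ Ber(link(W C + μ)). A minimal sketch under assumed toy dimensions, using the logistic link (the slides allow probit or logistic):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, N, K = 20, 50, 3   # problems, students, concepts (K small -> low rank)

# W: problem-concept associations; each problem touches few concepts (sparse).
W = rng.random((Q, K)) * (rng.random((Q, K)) < 0.3)
# C: each student's mastery of each concept.
C = rng.standard_normal((K, N))
# mu: each problem's intrinsic difficulty (one offset per problem).
mu = rng.standard_normal((Q, 1))

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# Probability each student solves each problem (defined even for
# unsolved problems), then the Bernoulli "coin flip" that yields
# the observed 0/1 grade matrix.
P = logistic(W @ C + mu)
Y = (rng.random((Q, N)) < P).astype(int)
print(Y.shape)  # (20, 50)
```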

  26. solving SPARFA • significant recent progress in relaxation-based optimization for sparse/low-rank problems • matrix-based methods (SPARFA-M) • Bayesian methods (SPARFA-B) • current work: scaling up to massive # students and problems • complexity scales ~ (#students) × (#problems) • related work at Duke: Carin et al.
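In the spirit of SPARFA-M, a toy sketch (not the published algorithm): alternate gradient steps on the logistic likelihood, with soft-thresholding on W to encourage the sparse problem-concept structure. All sizes and step settings here are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
Q, N, K = 15, 30, 2
Y = rng.integers(0, 2, size=(Q, N)).astype(float)   # toy 0/1 grade data

W = 0.1 * rng.standard_normal((Q, K))   # problem-concept loadings
C = 0.1 * rng.standard_normal((K, N))   # student concept mastery
mu = np.zeros((Q, 1))                   # problem difficulties

logistic = lambda z: 1.0 / (1.0 + np.exp(-z))
def loss():
    P = logistic(W @ C + mu)
    return -np.sum(Y * np.log(P) + (1 - Y) * np.log(1 - P))

step, lam = 0.5, 0.01
start = loss()
for _ in range(300):
    G = logistic(W @ C + mu) - Y        # gradient wrt the logits
    W -= step * (G @ C.T) / N           # gradient step on W ...
    W = np.sign(W) * np.maximum(np.abs(W) - step * lam / N, 0)  # ... plus soft-threshold
    G = logistic(W @ C + mu) - Y
    C -= step * (W.T @ G) / Q           # gradient step on C
    mu -= step * G.mean(axis=1, keepdims=True)   # and on the difficulty offsets
print(loss() < start)  # the fit improves on this toy data
```

Each update touches all Q×N entries, which is where the (#students) × (#problems) complexity noted on the slide comes from.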

  27. standard practice

  28. questions (w/ estimated inherent difficulty) • concepts (w/ estimated student knowledge + vocab tags)

  29. dashboards • student/instructor dashboard to replace grade book • estimate and track student concept mastery, on individual and class basis • feedback on individual problems (concepts involved, etc.) • identify strong/weak areas, including what to watch out for when studying • progress through the “course map” • relative standing in class • projected final grade/exam score • automatic “concept map” • estimate problem difficulty and identify good/bad problems (curriculum design) • suggest what content student(s) should study next (scheduling) • detect cheating and gaming

  30. marketing approach • most educators/systems are resistant to changing their teaching methods wholesale or overnight • pragmatic 3-step plan for crossing the chasm: • for instructors: computer-graded homework replacement; for students: free, computer-graded practice problems • advanced learning analytics providing feedback to student and instructor • instructor/machine generated personalized learning paths

  31. practice widget (“Test your understanding”) • Phase 2 OSC textbooks will have integrated practice opportunities for students • valuable data for both OSC and machine learning

  32. beta testing ELEC301 Signals and Systems • homework replacement w/ cog sci (feedback, retrieval practice, repetition, spacing) • no machine learning based personalization or dashboards preliminary findings • better retention and transfer of knowledge on an end-of-semester assessment relative to standard practice • magnitude of the benefit was almost equivalent to one letter grade considering completely accurate use of knowledge (no partial credit), and about half of one letter grade giving credit for partial knowledge summary • OST > standard practice • effect size ≈ 1/2 to 1 letter grade

  33. ongoing experiments • 2011-2013 ECE courses • STEMScopes (>1M K-12 students) • validate OpenStax Tutor performance • inspire and support new research in cognitive science

  34. cog-sci research questions • Are the eigenstudent factors (in)variant across different domains? • ex: learning algebra vs. learning French • How do eigenstudent factors relate to standard psychological factors like cognitive ability, ability to abstract, memory, etc.? • What can SPARFA learn from multimodal data like time-on-task, etc.?…

  35. building a center-of-excellence in digital learning • vast scope: from pre-K to college and beyond, from research to deployment • vast scale: 10s of millions of users • multidisciplinary: machine learning, cognitive science, pedagogy
