
An introduction to intelligent interactive instructional systems Kurt VanLehn ASU


Presentation Transcript


  1. An introduction to intelligent interactive instructional systems Kurt VanLehn ASU

  2. Outline Tutoring systems • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination Other interactive instructional systems

  3. Intelligent “tutoring” system is a misnomer • Almost all are used as seatwork/homework coaches • The instructor still… • Lectures • Leads whole class, small group & lab activities • Assigns grades; defends grades • Can assign homework / seatwork problems • or delegate to the tutoring system • The instructor no longer… • Grades homework / seatwork • Tests? • For-profit web-based homework grading services are growing rapidly

  4. If students enter only the answer, call it answer-based tutoring. [Figure: a geometry problem asks “What is the value of x?” over a diagram with angles 30°, 40°, 45°, x°, and y°; the student types only the final answer (x = 25) into an Answer box.]

  5. If students enter steps that derive the answer, call it step-based tutoring. [Figure: the same geometry problem (“What is the value of x?”; angles 30°, 40°, 45°, x°, y°), with the answer reached through a chain of steps.]

  6. Def: Feedback is a comment on one of the student’s steps. [Figure: same problem; a correct step is marked “OK” while an incorrect step draws “Oops! Check your arithmetic.”]

  7. Feedback is often given as a hint sequence. [Figure: same problem; the sequence opens with “Oops! Check your arithmetic.”]

  8. Hints become more specific. [Figure: same problem; next hint: “You seem to have made a sign error.”]

  9. Hints segue from commenting on the student’s step to suggesting a better step. [Figure: same problem; hint: “Try taking a smaller step.”]

  10. …and become more specific. [Figure: same problem; hint: “Try doing just one arithmetic operation per step.”]

  11. Def: A bottom-out hint is the last hint, which tells the student what to enter. [Figure: same problem; bottom-out hint: “Enter 70+y=180, and keep going from there.”]

  12. Def: A next-step help request is another way to start up a hint sequence. [Figure: same problem; the student asks what to do next and receives “Try doing just one arithmetic operation per step.”]
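
The hint-sequence behavior of slides 7-12 can be summarized in code. Below is a minimal Python sketch, not taken from any actual tutoring system; the HintSequence class and next_hint method are hypothetical names, though the hint strings are the ones on the slides.

```python
# A minimal sketch of a hint sequence (slides 7-12). The class and
# method names are hypothetical; the hint texts come from the slides.

class HintSequence:
    """Hints become more specific, segue from commenting on the student's
    step to suggesting a better step, and end with a bottom-out hint."""

    def __init__(self, hints):
        self.hints = hints
        self.index = 0

    def next_hint(self):
        # Repeated requests walk down the sequence; once the bottom-out
        # hint is reached, it is simply repeated.
        hint = self.hints[min(self.index, len(self.hints) - 1)]
        self.index += 1
        return hint

# Started either by an incorrect step or by a next-step help request:
sequence = HintSequence([
    "Oops! Check your arithmetic.",                       # comment on the step
    "You seem to have made a sign error.",                # more specific
    "Try taking a smaller step.",                         # suggest a better step
    "Try doing just one arithmetic operation per step.",  # more specific still
    "Enter 70+y=180, and keep going from there.",         # bottom-out hint
])
```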

  13. Delayed (as opposed to immediate) feedback occurs when the solution is submitted. [Figure: same problem; no feedback appears while the steps are being entered.]

  14. Delayed (as opposed to immediate) feedback occurs when the solution is submitted. [Figure: same problem; after submission, correct steps are marked “OK” and incorrect ones draw comments such as “Oops! Check your arithmetic.” and “Can an angle measure be negative?”]

  15. Both step-based tutors and answer-based tutors have a task loop • Tutor and/or student select a task • Tutor poses it to the student • Student does the task and submits an answer • If answer-based tutor, then work offline • If step-based tutor, then work online • The step-loop = Do step; get feedback/hints; repeat • Repeat
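
A minimal Python sketch of the two nested loops on slide 15 follows; every name here (select_task, pose, grade_answer, do_step, check_step) is a hypothetical placeholder rather than the API of any real tutor.

```python
# A sketch of the task loop and, inside it, the step loop (slide 15).
# All method names are hypothetical placeholders.

def run_tutor(tutor, student, step_based=True):
    while True:                                   # task loop
        task = tutor.select_task(student)         # tutor and/or student select a task
        if task is None:
            return                                # no more tasks
        tutor.pose(task)                          # tutor poses it to the student
        if not step_based:
            answer = student.solve(task)          # answer-based: student works offline
            tutor.grade_answer(task, answer)
        else:
            while not task.solved():              # step loop
                step = student.do_step(task)      # do step...
                feedback = tutor.check_step(task, step)
                student.read(feedback)            # ...get feedback/hints; repeat
```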

  16. Technical terms/concepts (so far) • Answer-based tutoring system (= CAI, CBI, …) • Step-based tutoring system (= ITS, ICAI…) • Step • Next-step help • Feedback • Immediate • Delayed • Hint sequence • Bottom-out hint • Task loop • Step loop

  17. Andes user interface. [Screenshot with labeled regions: read a physics problem, draw vectors, type in equations, type in answer.]

  18. Andes feedback and hints. [Screenshot with labels: “What should I do next?”, “What’s wrong with that?”, dialogue & hints; green means correct, red means incorrect.]

  19. SQL-Tutor (Addison Wesley). [Screenshot with labels: problem; steps; Submit! button; feedback; the database that the problem refers to.]

  20. Cognitive Algebra I Tutor (Carnegie Learning). [Screenshot of a problem with labeled steps: enter an equation, divide both sides, label a column, define an axis, fill in a cell, plot a point.]

  21. AutoTutor. [Screenshot: the task and a tutor-student dialogue.] Each tutor turn + student turn in the dialogue is a step; the student input is the 2nd half of the step.

  22. Introduction: Summary • Main ideas • Task loop over tasks • Step loop over steps of a task • Feedback can be immediate or delayed • But it focuses on steps • Hint sequence • Types of tutoring systems • Step-based tutors (ITS) – both loops • Answer-based tutors (CBT, CAI, etc) – task loop only

  23. Initial framework • Step loop • User interface • Interpreting student actions • Suggesting good actions • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  24. Initial framework • Step loop • User interface • Forms, with boxes to be filled • Dialogue • Simulation • Etc. • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  25. Initial framework • Step loop • User interface • Interpreting student steps • Equations • Typed natural language • Actions in a simulation • Etc. • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  26. Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Any correct path vs. shortest path to answer • Which steps can be skipped? • Recognize the student’s plan and suggest its next step? • Etc. • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  27. Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Give a hint before the student attempts a step? • Immediate vs. delayed feedback? feedback on request? • How long a hint sequence? When to bottom out immediately? • Etc. • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  28. Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Keeping the student in the “zone of proximal development” (ZPD) • Mastery learning: Keep giving similar tasks until students master them • Choosing a task that suits the learner’s style/attributes • Etc. • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  29. Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  30. Assessment vs. Evaluation • “Assessment” of students • What does the student know? • How motivated/interested is the student? • “Evaluation” of instructional treatments • Was the treatment implemented as intended? • Did it produce learning gains in most students? • Did it produce motivation gains in most students? • What is the time cost? Other costs?

  31. Assessment consists of fitting a model to data about the student • Single factor model: A single number representing competence/knowledge • Probability of a correct answer on a test item = f(competence(student), difficulty(item)) • Knowledge component model: One number per knowledge component representing its mastery • Probability of a correct answer on a test item = f(mastery(KC1), mastery(KC2), mastery(KC3), …) where KCn are the ones applied in a correct solution
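
Slide 31 leaves the function f abstract. The sketch below picks one common instantiation for each model, a Rasch-style logistic curve for the single factor model and a conjunctive product over KCs for the knowledge component model; both choices are illustrative assumptions, not necessarily what any particular system does.

```python
import math

# Illustrative instantiations of the two assessment models on slide 31.

def p_correct_single_factor(competence, difficulty):
    # Rasch-style logistic: higher competence and lower item difficulty
    # raise the probability of a correct answer.
    return 1.0 / (1.0 + math.exp(-(competence - difficulty)))

def p_correct_kc_model(masteries):
    # Conjunctive model: the item is answered correctly only if every KC
    # applied in a correct solution is applied successfully, so the
    # mastery probabilities multiply.
    p = 1.0
    for mastery in masteries:
        p *= mastery
    return p

# E.g., for the algebra item on the next slide, which applies KC5 and KC8:
# p_correct_kc_model([0.8, 0.6]) == 0.48
```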

  32. Example: Answer-based assessment of algebraic equation solving skill • Test item: Solve 3+2x=10 for x • KC5: Subtract from both sides & simplify: 3+2x=10 → 2x=7 • KC8: Divide both sides & simplify: 2x=7 → x=3.5 • Single factor model • If answer is correct, increment competence, else decrement • Knowledge component model • If answer correct, increment mastery of KC5 & KC8 • If answer incorrect, decrement mastery of KC5 & KC8 • Weakest one is most likely to be the failure, so decrement it more

  33. Step-based assessment of algebraic equation solving skill • Solve 3+2x=10 for x • Step 1: 2x = 7 • Step 2: x = 3.5 • Single factor model: Whenever a step is answered correctly without hints, increment competence, else decrement • Knowledge component model: Whenever a step is answered correctly without hints, increment its KC’s mastery, else decrement
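
Slides 32-33 describe these updates qualitatively. Here is a hedged Python sketch; the step size delta, the clamping to [0, 1], and the doubled decrement for the weakest KC are illustrative choices, not values from the slides.

```python
# Increment/decrement heuristics from slides 32-33. The concrete step
# size and the factor of 2 on the weakest KC are illustrative.

def answer_based_update(mastery, kcs, correct, delta=0.1):
    """Answer-based: credit or blame all KCs used in a correct solution."""
    if correct:
        for kc in kcs:
            mastery[kc] = min(1.0, mastery[kc] + delta)
    else:
        # The weakest KC is the most likely failure, so decrement it more.
        weakest = min(kcs, key=lambda kc: mastery[kc])
        for kc in kcs:
            step = 2 * delta if kc == weakest else delta
            mastery[kc] = max(0.0, mastery[kc] - step)

def step_based_update(mastery, kc, correct_without_hints, delta=0.1):
    """Step-based: each step exercises one KC, so credit/blame is precise."""
    if correct_without_hints:
        mastery[kc] = min(1.0, mastery[kc] + delta)
    else:
        mastery[kc] = max(0.0, mastery[kc] - delta)

# E.g., solving 3+2x=10: a correct unaided "2x = 7" step raises KC5;
# an incorrect answer to the whole item lowers both KC5 and KC8.
mastery = {"KC5": 0.5, "KC8": 0.4}
step_based_update(mastery, "KC5", correct_without_hints=True)
answer_based_update(mastery, ["KC5", "KC8"], correct=False)
```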

  34. Task selection uses assessments • Single factor model • Choose a task that is the right level of difficulty, i.e., in the ZPD (zone of proximal development) of the student • Knowledge component model • Choose a task whose solution uses mostly mastered KCs, and only a few KCs that need to be mastered
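
A minimal sketch of KC-based task selection under slide 34's policy: prefer tasks whose solutions use mostly mastered KCs plus a few unmastered ones. The 0.95 mastery threshold, the 1-2 unmastered-KC window, and the task.kcs attribute are all illustrative assumptions.

```python
# Illustrative KC-based task selection (slide 34). The threshold, the
# 1-2 unmastered-KC window, and the task.kcs attribute are assumptions.

def select_task(tasks, mastery, threshold=0.95):
    def unmastered(task):
        return sum(1 for kc in task.kcs if mastery.get(kc, 0.0) < threshold)

    # Tasks with 1-2 unmastered KCs stretch the student slightly,
    # roughly approximating the ZPD; otherwise fall back to the task
    # with the fewest unmastered KCs.
    in_zpd = [t for t in tasks if 1 <= unmastered(t) <= 2]
    return min(in_zpd or tasks, key=unmastered)
```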

  35. Other assessment issues • Other decisions, besides task selection, that can use assessment? • Assessment of motivation or interest? • Assessment of learning styles? Disabilities? • Diagnosis of misconceptions? Bugs?

  36. Should a “Skillometer” display knowledge component mastery to the student?

  37. Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  38. Authoring • Author creates new tasks • Author generates all solutions? • System generates all solutions? • Same taste as author? • Can author add new problem-solving knowledge? • Who can be an author? • Instructors? • Professional authors? • Knowledge engineers?

  39. Software architecture & engineering • Client-server issues • Platform independence • Integration with learning management systems • E.g., Blackboard, WebAssign, many others • Cheating, privacy • Quality assurance • Software bugs • Content & pedagogy bugs

  40. Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  41. Types of evaluations • Analyses of expert human tutors • What do they do that the system should emulate? • Formative evaluation • What behaviors of the system need to be fixed? • Have students talk aloud; interview students and teachers; … • Summative evaluation • Is the system more effective than what it replaces? • Two condition experiment: System vs. control/baseline • Pre-test and post-test (+ other assessments) • Hypothesis testing • Why is the system effective? • Multi-condition experiments: System ± feature(s)

  42. Example: Summative evaluation of the Andes Physics tutor • University physics (mechanics) 1 semester • 2 Conditions: Homework done with… • Andes physics tutor • Pencil & paper • Same teachers (sometimes), text, exams, labs • Results (2000-2003) in terms of effect sizes • Experimenter’s post-test: d=1.2 • Final exam: d=0.3 • d = (mean_Andes_score – mean_control_score) ÷ pooled_standard_deviation
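
The effect size d defined at the bottom of slide 42 is straightforward to compute; below is a small Python implementation using the usual pooled standard deviation, with invented scores for illustration only (not the study's data).

```python
import math

# Effect size d = (mean_Andes - mean_control) / pooled SD (slide 42).

def cohens_d(treatment, control):
    def mean(xs):
        return sum(xs) / len(xs)

    def var(xs):  # sample variance (n - 1 denominator)
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    n1, n2 = len(treatment), len(control)
    pooled_sd = math.sqrt(
        ((n1 - 1) * var(treatment) + (n2 - 1) * var(control)) / (n1 + n2 - 2)
    )
    return (mean(treatment) - mean(control)) / pooled_sd

# Invented example scores, for illustration only:
print(cohens_d([82, 75, 90, 88], [70, 68, 77, 73]))
```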

  43. Aptitude-treatment interaction (ATI)

  44. [Figure: open-response, problem-solving exam scores plotted against grade-point average for the Andes and Control conditions.]

  45. Ideal tutoring system adapts to the student’s needs. [Figure: assistance provided (low to high) plotted against assistance needed; matching the two yields large learning gains, while providing more than needed leaves students bored & irritated and providing less leaves them struggling.] Assistance provided = task selection, feedback, hints, user interface…

  46. Not-so-good tutoring system helps only some students. [Figure: same axes; the not-so-good tutor’s assistance matches what students need over only part of the range, so some students see large learning gains while others end up bored & irritated or struggling.] Assistance provided = task selection, feedback, hints, user interface…

  47. Next Initial framework • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination

  48. Dissemination = getting the system into widespread use • Routes • Post and hope • Open source • Commercialization • Issues • Instructor acceptance • Instructor training • Student acceptance • Marketing

  49. Next Outline Tutoring systems • Step loop • User interface • Interpreting student steps • Suggesting good steps • Feedback and hints • Task selection • Assessment • Authoring and the software architecture • Evaluations • Dissemination Other interactive instructional systems

  50. Other intelligent interactive instructional systems • Teachable agent • Student deliberately teaches the system, which is then assessed (in public) • Learning companion • Student works while system encourages • Peer learner • Student and system work & learn together • To be discovered…
