
Intelligent Computer-Aided Instruction: A Survey Organized Around System Components


Presentation Transcript


  1. Intelligent Computer-Aided Instruction: A Survey Organized Around System Components
  Author: Jeff W. Rickel, 1989
  Speaker: Amy Davis
  CSCE 976 (Advanced AI)
  April 29th, 2002

  2. Outline of Presentation
  • Why ICAI?
  • Overview of main systems and technologies discussed in this paper
  • Contributions of seminal systems to various components of ICAI systems

  3. ICAI – better than CAI
  • First came CAI
    • Fully specifies the presentation
    • All questions and their answers
    • Strict flow of control
    • “Electronic page-turning”
  • Need for intelligence recognized
    • Rich domain knowledge (and representation)
    • Ability to use knowledge in unspecified ways
    • Individualized instruction for each student

  4. ICAI – Representative of AI
  • No commercial ICAI systems exist
  • ICAI is an active research topic in AI
  • ICAI employs many AI techniques
    • Requires reasoning from a rich knowledge representation
    • Models the user
    • Needs communication and information structures
    • Needs “common sense” reasoning

  5. ICAI systems (I)
  • WEST (R. R. Burton and J. S. Brown, 1982)
    • “Conquer the West” with mathematical equations that evaluate to the number of spaces you want to move.
  • SCHOLAR (Jaime Carbonell, 1970)
    • Learn geography by holding a natural-language dialogue with the computer.

  6. ICAI systems (II)
  • WHY (Stevens and Collins, 1977)
    • Understand when and why rainfall happens by holding a discussion with the computer.
  • SOPHIE (Sleeman and Brown, 1982)
    • Learn by example how to troubleshoot electronic circuits.

  7. ICAI systems (III)
  • STEAMER (Hollan, Hutchins and Weitzman, 1984)
    • Manipulate the controls of a steam propulsion system to gain an understanding of how each control affects the system.
  • RBT: Recovery Boiler Tutor (Woolf, 1986)
    • Solve problems in real time on a simulated boiler.

  8. ICAI systems (IV)
  • WUMPUS (Goldstein, 1978)
    • “Hunt the Wumpus” using mathematical and logical skills.
  • MYCIN, GUIDON
    • Find the likely bacterial cause for the symptoms provided.

  9. ICAI Goals
  • More effective computer-based tutors
  • More economical computer-based tutors
  • Reflect the current state of AI research

  10. Components of ICAI systems
  • Learning scenarios
  • Forms of knowledge representation
  • Student modeling
  • Student diagnosis
  • Pedagogical knowledge
  • Discourse management
  • Automatic problem generation
  • User interfaces

  11. ICAI Learning Scenarios
  • Goal: Involve more senses
    • Retain information longer
    • Make the student an active participant
  • Methods
    • Coaching
    • Socratic
    • Mixed-Initiative Dialogue
    • Articulate Expert
    • Simulation
    • Discovery Learning

  12. Learning Scenarios: Coaching
  • Only give advice when needed
    • Coach “looks over the student’s shoulder”
    • Offers timely but unobtrusive advice
    • Exposes key knowledge when the student’s performance plateaus
    • Like MS help
  • Common in gaming environments (e.g., WEST)
    • Determine if the student is using correct skills
    • Determine when the student needs guidance

  13. Learning Scenarios: Mixed-Initiative Dialogue
  • Hold a conversation with the student
    • Student responds to the computer’s questions, OR
    • Student initiates a line of questioning and the computer answers
  • SCHOLAR
    • More reactive to the student
    • Allows student initiative

  14. Learning Scenarios: Socratic
  “Education cannot be attained through passive exercises such as reading or listening, but instead from actual problem solving.”
  • Ask thought-probing questions
    • Require use of new knowledge
    • Point out gaps in knowledge
    • Expose misconceptions
  • WHY tutor

  15. Learning Scenarios: Articulate Expert
  • SOPHIE
  • Teach by example
    • Solve problems with the student watching
    • Explain the reasons for decisions
    • Demonstrate troubleshooting tactics
  • Then make the student solve problems
    • Occasionally provide guidance
    • Force the student to give a rationale for choices
    • Students should ask themselves, “Why am I doing this action?”

  16. Learning Scenarios: Interactive, Inspectable Simulation
  • Provide a simulation of a domain
  • Allow exploration of actions
    • See the effects of actions
    • No fear of real-world consequences
    • Potential to carry over into real-life situations
  • STEAMER, RBT

  17. Learning Scenarios: Discovery-Based Learning
  • Opposite of CAI
  • Student explores a micro-world emulation
    • Discovers rules and knowledge
    • Full student control – driven by curiosity
  • Prepares the student for scientific inquiry, real-life research, and creative thinking
  • Outside the scope of this paper

  18. Learning Scenarios: Summary
  • Determines the “look and feel” of the tutoring system
  • Based on the student–tutor balance of control
  • Requires support from the system’s knowledge base

  19. ICAI Domain Knowledge Representation
  • CAI: poor knowledge of the domain
    • Canned presentations
    • Canned questions
    • Canned answers
  • ICAI: more knowledge → fewer limitations
    • Supports understanding
    • Allows flexibility in teaching
  • Knowledge is key to intelligent behavior
  • The way knowledge is stored dictates its use

  20. Domain Knowledge
  • No general form is suitable for all knowledge
  • Challenge:
    • Determine the types of knowledge required
    • Find suitable representations
    • Support teaching particular subjects
  • Forms examined
    • Rule-based
    • Scripts
    • Semantic networks
    • Simulation
    • Condition/action rules

  21. Domain Knowledge: Rule-Based KR
  • Generally a failure
    • Misses low-level detail
    • Misses relations necessary for learning and tutoring
    • No analogies or multiple views
    • No levels of explanation
    • Need to know how rules fit together
  • MYCIN, GUIDON
  • Need knowledge plus perspective to communicate knowledge to the student

  22. Domain Knowledge: Scripts
  • WHY
    • Nodes → processes, events
    • Edges → relations between nodes
      • X enables Y
      • X causes Y
  • Script → a partially ordered sequence of processes and events linked by temporal or causal connections
  • Hierarchy of scripts: lower levels describe causal relationships within higher levels
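
The script structure above can be sketched in a few lines of Python. This is a hypothetical illustration: the event names, relation labels, and `antecedents` helper are invented for this example and are not taken from the WHY implementation.

```python
# Hypothetical sketch of a script: a partially ordered set of processes
# and events (nodes) linked by temporal/causal relations (edges).
rainfall_script = {
    "events": ["evaporation", "cloud formation", "condensation", "rainfall"],
    "links": [
        ("evaporation", "enables", "cloud formation"),
        ("cloud formation", "enables", "condensation"),
        ("condensation", "causes", "rainfall"),
    ],
}

def antecedents(script, event):
    """Events linked into `event` — the basis for 'why' explanations."""
    return [(x, rel) for (x, rel, y) in script["links"] if y == event]

print(antecedents(rainfall_script, "rainfall"))  # [('condensation', 'causes')]
```

Walking the links backward from an event is one way a tutor could answer a student's "why does rainfall happen?" question from the script itself.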

  23. Domain Knowledge: Semantic Networks
  • Highly structured database
    • Stores concepts and facts
    • Stores connections along many dimensions
    • Embeds linguistic information
    • Avoids storing redundant information through use of many connections
  • Use the database to generate questions
  • Common in other disciplines of AI
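
As a rough illustration of generating questions from such a database (a toy sketch — SCHOLAR's actual network was far richer, and these facts and the `capital_questions` helper are invented for the example):

```python
# Toy semantic network: each fact is stored once as a typed link
# between two concepts.
semantic_net = [
    ("Austin", "capital-of", "Texas"),
    ("Madison", "capital-of", "Wisconsin"),
    ("Texas", "part-of", "USA"),
]

def capital_questions(net):
    """Turn each 'capital-of' fact into a (question, answer) pair."""
    return [(f"What is the capital of {obj}?", subj)
            for subj, rel, obj in net if rel == "capital-of"]
```

Because facts are stored as typed links rather than canned Q&A text, the same network can drive questions, answers, and explanations.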

  24. Domain Knowledge: Simulation
  • STEAMER:
    • Mathematically simulates the steam propulsion system
    • Ties graphics to the simulation
  • SOPHIE
    • Propagates constraints to explain why a behavior occurs

  25. Domain Knowledge: Condition/Action Rules
  • Popular in AI
  • Model of human intelligence (?)
  • “Recognize a condition, initiate an action”
  • Attractive because rules are modular
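
A minimal forward-chaining sketch of the "recognize a condition, initiate an action" cycle. The rule contents here are invented tutoring rules for illustration, not from any of the surveyed systems:

```python
def run_rules(rules, facts):
    """Fire every rule whose condition set is satisfied by the current
    facts; repeat until no rule adds anything new."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, action in rules:
            if condition <= facts and action not in facts:
                facts.add(action)
                changed = True
    return facts

# Hypothetical tutoring rules: (condition set, action) pairs.
tutor_rules = [
    ({"wrong answer", "common error"}, "show hint"),
    ({"show hint", "still stuck"}, "reveal solution"),
]
```

The modularity shows here: a new rule can be appended to the list without touching the interpreter or the other rules.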

  26. Domain Knowledge: Summary
  • One representation doesn’t work for everything
  • Often need multiple representations within one problem (e.g., WHY)
  • Must be determined by how the knowledge is to be used

  27. ICAI Student Modeling
  • Goal: Know what the student knows
  • CAI: keeps a tally of correct and incorrect answers
    • Little adaptation to the student
  • Methods:
    • Overlay modeling (Goldstein, 1977)
    • Buggy modeling (R. R. Burton, 1982)

  28. Student Modeling: Overlay
  • Represent student knowledge as some function of the teacher’s knowledge
  • Allows comparison between what the student knows and what the student should know
  • WEST, SCHOLAR, WUMPUS
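
In its simplest form, the overlay is a subset of the expert's skills, and the set difference drives instruction. This is a bare-bones sketch with invented skill names:

```python
# Overlay model: the student's knowledge is represented as a subset
# of the expert's skill set.
expert_skills = {"addition", "subtraction", "fractions", "negatives"}
student_skills = {"addition", "subtraction"}

def skills_to_teach(expert, student):
    """Skills the student should know but has not yet demonstrated."""
    return expert - student
```

The comparison is trivial precisely because the student model is expressed in the same vocabulary as the expert model.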

  29. Student Modeling: Buggy Modeling
  • Include both “buggy” and correct rules which the student may be following
  • Allows the student’s errors to be understood
  • May require enumeration of all possible errors!
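
A tiny sketch of buggy modeling for subtraction. The "smaller-from-larger" bug is a classic example from the subtraction-bug literature; the code and the `diagnose` helper are illustrative, not from Burton's system:

```python
def correct_subtract(a, b):
    return a - b

def smaller_from_larger(a, b):
    # Buggy rule: always subtract the smaller operand from the larger.
    return abs(a - b)

procedures = {
    "correct": correct_subtract,
    "smaller-from-larger": smaller_from_larger,
}

def diagnose(a, b, student_answer):
    """Names of procedures consistent with the student's answer."""
    return [name for name, f in procedures.items()
            if f(a, b) == student_answer]
```

Note that `diagnose(3, 5, 2)` implicates only the buggy rule, while `diagnose(5, 3, 2)` matches both procedures — a small demonstration of why a tutor may need several problems (or a full enumeration of bugs) to pin down a student's error.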

  30. Student Modeling: Summary
  • Student modeling is still very open-ended
  • A full discussion is beyond the scope of the paper
  • Allows the computer to find reasons behind student errors – student diagnosis

  31. ICAI Student Diagnosis
  • Goal: Allow the student to make mistakes, and capitalize on them for better learning
  • Methods:
    • Differential modeling
    • Direct interpretation
    • Plan recognition (buggy model)
    • Error taxonomy

  32. Student Diagnosis: Differential Modeling
  • Like overlay modeling: view a student error as a shortcoming detected by comparison with the tutor’s knowledge
  • WEST

  33. Student Diagnosis: Direct Interpretation
  • Remove constraints on the question until the student’s answer becomes valid
    • Example: “What is the capital of Texas?” “Madison.” “Madison is the capital of Wisconsin.”
  • Reasons through a semantic net
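
The constraint-relaxation idea can be sketched as follows, using the Texas/Madison example from the slide (toy code; the fact list and `interpret` function are invented for the illustration):

```python
# Toy semantic net of capital-city facts.
facts = [
    ("Austin", "capital-of", "Texas"),
    ("Madison", "capital-of", "Wisconsin"),
]

def interpret(answer, relation, asked_about):
    """Check the strict question first, then relax the object constraint
    to see whether the answer is valid in some other context."""
    if (answer, relation, asked_about) in facts:
        return "Correct."
    # Relaxed: is the answer a valid subject for this relation at all?
    for subj, rel, obj in facts:
        if subj == answer and rel == relation:
            return f"{answer} is the capital of {obj}."
    return "Not recognized."
```

By dropping the object constraint, the tutor can recognize that the student's answer is a real capital — just of the wrong state — and give targeted feedback instead of a bare "wrong".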

  34. Student Diagnosis: Plan Recognition
  • Buggy model: try to find a path in the model (correct or incorrect) leading to the student’s answer
  • Plan recognition: finding the goals which underlie student actions
  • Similar to language parsing

  35. Student Diagnosis: Error Taxonomy
  • Classify errors into types
  • Example categories:
    • Missing information
    • Lack of a concept
    • Misfiled fact
    • Overgeneralization
  • SCHOLAR

  36. Student Diagnosis: Summary
  • Student diagnosis is not the goal: teaching is
  • Most diagnosis can be made easier by asking a few more questions
  • Allowing the student to discover their own errors is more effective (Socratic)
  • “A little meaningful feedback goes a long way”

  37. ICAI Pedagogical Knowledge
  • Teachers need to know more than just their subject: they need to know how to teach
  • Main problems
    • Lesson planning
    • Dealing with student errors
    • Production rules

  38. Pedagogy: Lesson Planning
  • Develop strategies for ordering topics
  • Decide how to present material
  • Decide the balance of control between tutor and student

  39. Pedagogy: Dealing with Student Errors
  • Two big decisions:
    • Decide when to interrupt the student
    • Decide what to say
  • Common ideologies:
    • Trap the student into discovering the error
    • Allow the student to see the consequences of actions
    • Redirect the student
    • Affirm correct choices

  40. Pedagogy: Summary
  • Just knowing the problem domain isn’t enough
  • Effective teachers have teaching “common sense”
  • Effective teachers respond to students

  41. ICAI Discourse Management
  • Goal: Flexibility in the tutorial discourse
  • CAI: hard-coded syllabus, sometimes with alternate paths
  • Methods:
    • Reactive
    • Incremental knowledge-building
    • Context-dependent
    • Hierarchical planning

  42. Discourse Management: Reactive
  • “Allow responses and misconceptions of the student to drive the dialogue”
  • SCHOLAR, WHY
  • Have a few initial goals (WHY), and modify them as the session proceeds

  43. Discourse Management: Incremental Building
  • Add on to the student’s current knowledge
    • Further develop a strong base
    • Explore new topics
  • WUMPUS

  44. Discourse Management: Context-Dependent
  • Use context to disambiguate questions and find answers
  • Context = position, progress, and current task of the student
  • Object-oriented tutoring incorporates this into a subject object

  45. Discourse Management: Hierarchical Planning
  • PhD dissertation of Beverly Woolf, 1984
  • Top-down refinement of goals
  • Domain independent
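
Top-down goal refinement for tutorial discourse might look like the following sketch. The plan library and goal names are invented for illustration; Woolf's planner was considerably more sophisticated:

```python
# Hypothetical discourse plan library: each goal expands into subgoals.
# Goals with no entry are treated as primitive tutorial actions.
plan_library = {
    "teach topic": ["introduce topic", "pose problem", "review answer"],
    "review answer": ["diagnose answer", "give feedback"],
}

def refine(goal):
    """Depth-first, top-down refinement of a discourse goal into a
    sequence of primitive tutorial actions."""
    subgoals = plan_library.get(goal)
    if subgoals is None:
        return [goal]  # primitive action
    actions = []
    for g in subgoals:
        actions.extend(refine(g))
    return actions
```

The domain independence shows in the structure: swapping in a plan library for a different subject changes the dialogue content without changing the planner.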

  46. Discourse Management: Summary
  • Discourse management requires knowledge
  • Knowledge is needed not just in the subject area
  • Authors vary in their opinions of how much flexibility is best

  47. ICAI Problem Generation
  • CAI: canned problems, canned answers
    • Hard on the course author
    • No adaptation to the student
    • Limited meaningful feedback
  • Generative CAI: programs generate new problems
  • Methods:
    • Problem-generation trees
    • Slot filling

  48. Problem Generation: Trees
  • Concept tree:
    • The student is at some level in the tree
    • The tree determines what to include in a question
  • Use a context-free grammar to form the actual question text
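
A toy sketch of the tree-plus-template idea. The concepts, templates, and helper names are invented; a real system would use a proper context-free grammar rather than fixed strings:

```python
# Concept tree: children refine the parent concept; leaves are teachable.
concept_tree = {
    "arithmetic": ["addition", "multiplication"],
    "addition": [],
    "multiplication": [],
}

def next_concept(tree, root, mastered):
    """First unmastered leaf, left to right — the student's level in the tree."""
    children = tree[root]
    if not children:
        return None if root in mastered else root
    for child in children:
        found = next_concept(tree, child, mastered)
        if found:
            return found
    return None

def generate_problem(concept, a, b):
    """Fill a per-concept template to produce (question, answer)."""
    templates = {
        "addition": (f"What is {a} + {b}?", a + b),
        "multiplication": (f"What is {a} * {b}?", a * b),
    }
    return templates[concept]
```

The tree picks *what* to ask next; the template (standing in for the grammar) determines *how* the question is worded.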

  49. Problem Generation: Slot Filling
  • Choose a kind of problem
    • Example: fill-in-the-blank, multiple choice
  • Fill the problem’s slots with information from the semantic net
  • Requires a rich knowledge base
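
Slot filling from a semantic net might look like this toy sketch: the template's answer slot is filled from the matching fact, and distractor slots from other facts with the same relation. Facts, template, and the `multiple_choice` helper are invented for the example:

```python
# Toy semantic net of capital-city facts.
facts = [
    ("Austin", "capital-of", "Texas"),
    ("Madison", "capital-of", "Wisconsin"),
    ("Sacramento", "capital-of", "California"),
]

def multiple_choice(state):
    """Fill a multiple-choice template: stem, correct answer, distractors."""
    answer = next(s for s, r, o in facts
                  if r == "capital-of" and o == state)
    distractors = [s for s, r, o in facts
                   if r == "capital-of" and s != answer]
    return f"Which city is the capital of {state}?", answer, distractors
```

This is why a rich knowledge base matters: plausible distractors come for free from the same relation, rather than being hand-authored per question.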

  50. Problem Generation: Summary
  • Tree-like structures are used for generating problems
  • Problems that are generated must also be solved
