
POSTECH Dialog-Based Computer Assisted Language Learning System



Presentation Transcript


  1. POSTECH Dialog-Based Computer Assisted Language Learning System Intelligent Software Lab. POSTECH Prof. Gary Geunbae Lee

  2. Contents • Introduction • Methods • DB-CALL System • Example-based Dialog Modeling • Feedback Generation • Translation Assistance • Comprehension Assistance • Language Learner Simulation • User Simulation • Grammar Error Simulation • Discussion

  3. RESEARCH BACKGROUND

  4. PREVIOUS WORKS ON DB-CALL • Let’s Go (CMU, 02-04) • Providing bus schedule information for non-native CMU students • Adapting the acoustic and language models to non-native speakers • Edit-distance based corrective feedback

  5. PREVIOUS WORKS ON DB-CALL • SPELL (Edinburgh, 05) • Restaurant domain • Scenario-based virtual space • Incorporating mal-rules into the ASR grammar

  6. PREVIOUS WORKS ON DB-CALL • DEAL (KTH, 07) • Trade domain • Finite-state-network-based limited dialog management • When learners get stuck, the system provides hints

  7. POSTECH DB-CALL System (Architecture diagram: an ESL dialog tutoring component is backed by a web crawler; a Description Extractor builds expression/description entries (Korean EXP > English EXP) and a Parallel Sentence Extractor builds aligned examples in the format <parallel><source>…</source><target>…</target></parallel> with alignment info (<s2t>, <t2s>, <composition>) and additional metadata (<url>); given the user input, the tutor suggests "Try this expression" during the tutoring dialog.)

  8. DB-CALL System

  9. 1. Example-based Dialog Modeling

  10. INTRODUCTION • Spoken Dialog System • Applications • Human-Robot Interface, Telematics, Tutoring, ...

  11. PROBLEM & GOAL • PROBLEM • How to determine the next system action • Knowledge-based approach • Plan recipe / ISU rule / Agenda • Data-driven approach • Statistical approach • Supervised Learning based on state approximation • Reinforcement Learning based on MDP/POMDP • Example-based approach • GOAL • To develop a simple and practical approach to dialog modeling for multi-domain dialog systems

  12. IDEA Turn #1 (Domain=Building_Guidance) from the dialog corpus: USER: 회의실이 어디지? (Where is the meeting room?) [Dialog Act = WH-QUESTION] [Main Goal = SEARCH-LOC] [ROOM-TYPE = 회의실 (meeting room)] SYSTEM: 3층에 교수회의실, 2층에 대회의실, 소회의실이 있습니다. (The faculty meeting room is on the 3rd floor; the large and small meeting rooms are on the 2nd floor.) [System Action = inform(Floor)] Each example is indexed by semantic & discourse features: Domain = Building_Guidance; Dialog Act = WH-QUESTION; Main Goal = SEARCH-LOC; ROOM-TYPE = 1 (filled), ROOM-NAME = 0 (unfilled); LOC-FLOOR = 0, PER-NAME = 0, PER-TITLE = 0; Previous Dialog Act = <s>; Previous Main Goal = <s>; Discourse History Vector = [1,0,0,0,0]; Lexico-Semantic Pattern = "ROOM_TYPE 이 어디 지?" (where is the ROOM_TYPE?); System Action = inform(Floor). At runtime, the dialog example having the most similar state is retrieved from the dialog state space. Lee et al. (2006), A Situation-based Dialogue Management using Dialogue Examples, IEEE ICASSP

  13. ALGORITHM • Query Generation • Building an SQL statement from the discourse history and SLU results • Example Search • Searching the example DB for semantically close dialog examples given the current dialog state, relaxing query constraints when nothing matches (Relaxation Strategy) • Example Selection • Selecting the best example by maximizing an utterance similarity measure based on lexical and discourse information (Pipeline: noisy input from ASR/SLU → query generation → example search against the example DB and content DB → example selection → system template NLG; a minimal sketch follows below.)
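A minimal sketch of this lookup cycle, assuming a simple SQLite example DB; the schema, feature names, and similarity measure are illustrative stand-ins, not the actual POSTECH implementation.

import sqlite3

def search_examples(db, state):
    # Query Generation: build an SQL statement from the SLU result and
    # discourse history. Relaxation Strategy: drop constraints until
    # some example matches.
    keys = ["domain", "dialog_act", "main_goal", "prev_dialog_act"]
    while keys:
        where = " AND ".join(k + " = ?" for k in keys)
        rows = db.execute(
            "SELECT utterance, system_action FROM examples WHERE " + where,
            [state[k] for k in keys],
        ).fetchall()
        if rows:
            return rows
        keys.pop()  # relax: drop the least important constraint
    return []

def select_example(candidates, user_utterance):
    # Example Selection: pick the candidate maximizing a lexical
    # similarity score (token overlap here, standing in for the
    # lexico-semantic pattern matching of the paper).
    tokens = set(user_utterance.split())
    def score(row):
        example_tokens = set(row[0].split())
        return len(tokens & example_tokens) / max(len(tokens | example_tokens), 1)
    return max(candidates, key=score, default=None)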

  14. EXPERIMENTAL RESULTS • Real user evaluation • 10 undergraduates • Evaluation Metrics • STR (Success Turn Rate) • # of successful turns / # of total turns • TCR (Task Completion Rate) • # of successful dialogs / # of total dialogs • AvgUserTurn • Average number of user turns per dialog Lee et al. (2009), Example-based Dialog Modeling for Practical Multi-domain Dialog Systems, SPECOM

  15. EXPERIMENTAL RESULTS (Chart: example match rate of each dialog system.) Lee et al. (2009), Example-based Dialog Modeling for Practical Multi-domain Dialog Systems, SPECOM

  16. ROBUST DIALOG MANAGEMENT • PROBLEM • How to overcome errors in the real world • ROBUST DIALOG MANAGEMENT • Error handling • Recovering ASR/SLU errors by interacting with the user at the conversational level • N-best support • Estimating the current state with uncertainty (Diagram: errors enter at the ASR and SLU stages; countermeasures include noise reduction and adaptation, n-best/lattice/confusion-network output, robust parsing and data-driven approaches, and, at the DM stage, error handling and n-best support.) Lee et al. (2008), Robust dialog management with n-best hypotheses using dialog examples and agenda, ACL

  17. GOAL & IDEA • To increase the robustness of EBDM with prior knowledge 1) Error Handling • AgendaHelp • S: Next, you can do the subtask 1) Asking the room's role, or 2)Asking the office phone number, or 3) Selecting the desired room for navigation. If the system knows what the user will do next Dynamic Help Generation FOCUS NODE LOCATION UtterHelp S: Next, you can say 1) “What is it?”, or 2) “What’s the phone number of [ROOM_NAME]?”, or 3) “ Let’s go there. ROOM ROLE OFFICE PHONE NUMBER GUIDE NEXT_TASK

  18. GOAL & IDEA • To increase the robustness of EBDM with prior knowledge 2) N-best support If the system knows which subtask will be more probable next Rescoring N-best hypotheses (h1~hn) h1 ROOM NAME h3 FLOOR LOCATION h2 OFFICE PHONE NUMBER h4

  19. ALGORITHM (Diagram: the n-best ASR/SLU hypotheses (w1/u1/s1 … wn/un/sn) from the user are interpreted against the focus stack and the agenda node graph (V1–V9); discourse interpretation selects the best example ej* and system action am* by argmax over the candidate examples (e1 … ek) and nodes.)

  20. EXPERIMENT SET-UP • Simulated user evaluation • Test set: 1000 simulated dialogs (<20 user turns) • Domain: intelligent robot for building guidance • Using 5-best recognition hypotheses • Evaluation Metrics • TCR • # of successful dialogs / # of total dialogs • AvgUserTurn • Average number of user turns per dialog • AvgScore • 20 * TCR - 1 * AvgUserTurn (computed as in the sketch below)
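These metrics reduce to simple counting; a sketch, assuming each dialog records its user turns and a task-completion flag.

def evaluate(dialogs):
    n = len(dialogs)
    tcr = sum(d["task_completed"] for d in dialogs) / n
    avg_user_turn = sum(len(d["turns"]) for d in dialogs) / n
    avg_score = 20 * tcr - avg_user_turn  # reward completion, penalize length
    return {"TCR": tcr, "AvgUserTurn": avg_user_turn, "AvgScore": avg_score}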

  21. EXPERIMENTAL RESULTS (Chart: the average score of different methods.) Lee et al. (2009), Hybrid Approach to Robust Dialog Management using Agenda and Dialog Examples, Computer Speech and Language (submitted)

  22. EXPERIMENTAL RESULTS (Chart: the average score of the P-EAR system by n-best size.) Lee et al. (2009), Hybrid Approach to Robust Dialog Management using Agenda and Dialog Examples, Computer Speech and Language (submitted)

  23. DEMO VIDEO • PC demo

  24. DEMO VIDEO • Robot demo

  25. 2. Feedback Generation

  26. INTRODUCTION Tutoring Process: Recast Feedback • When the learner's utterance contains errors, the tutor issues a clarification request, and the expression/description DB supplies a recast ("Try this expression") for the learner to take up • Tutor: What is the purpose of your trip? • User: My purpose business • Tutor: Sorry, I don't understand. What did you say? (Try this expression: "I am here on business") • User: I am here on business (learner uptake)

  27. INTRODUCTION Tutoring Process: Expression Suggestion • When a TIMEOUT expires without user input, the tutor suggests an expression from the expression/description DB • Tutor: What is the purpose of your trip? (TIMEOUT) • Tutor: Sorry, I can't hear you. (Try this expression: "I am here on business") • User: I am here on business (learner uptake)

  28. PROBLEMS • How to recognize user intentions despite numerous errors in their utterances • The mal-rule based technique used in previous studies doesn't work for low-level learners because their utterances contain multiple errors • Some utterances even seem to have a meaning that differs from what the learner intended to say • Intended meaning: When does the bus leave? • Learner's utterance: Which time I have to leave? • How to choose appropriate user intentions to suggest when a timeout expires • The system should take the dialog context into consideration, as human tutors do • Our approach: performing intention-based soft pattern matching to generate correct feedback

  29. METHODS • Context-aware & level-specific intention recognition • Intention-based pattern matching (Pipeline: the learner's utterance is scored by level-specific utterance models trained on Level 1 … Level N data and by a dialog-state-based model; the intention recognizer passes the learner's intention to the dialog manager, which updates the dialog state, searches the example expression DB, and pattern-matches the example expressions to generate feedback; a sketch follows below.)
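A sketch of the hybrid, level-specific recognition idea: a per-level utterance model is interpolated with a dialog-state-based model that predicts the intention from context alone. The model interface and interpolation weight are assumptions; each predict_proba here is assumed to return a dict mapping intentions to probabilities.

def recognize_intention(utterance, dialog_state, level, models, lam=0.6):
    # models["utterance"][level]: P(intention | utterance), trained on
    # data from learners of that level; models["state"]: P(intention |
    # dialog state), independent of the words actually spoken.
    p_utt = models["utterance"][level].predict_proba(utterance)
    p_ctx = models["state"].predict_proba(dialog_state)
    scores = {i: lam * p_utt.get(i, 0.0) + (1 - lam) * p_ctx.get(i, 0.0)
              for i in set(p_utt) | set(p_ctx)}
    return max(scores, key=scores.get)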

  30. EXPERIMENT SET-UP • Primitive data set • Immigration domain • 192 dialogs, 3517 utterances (18.32 utt/dialog) • Annotation • Manually annotated each utterance with the speaker’s intention and component slot-values • Automatically annotated each utterance with the discourse information

  31. EXPERIMENTAL RESULTS (Chart: intention recognition performance of the utterance model vs. the hybrid model.)

  32. EXPERIMENTAL RESULTS (Chart: level-specific vs. level-ignoring variants of the hybrid and utterance models.)

  33. EXPERIMENTAL RESULTS

  34. Demo: POSTECH DB-CALL initial version 2008

  35. 3. Translation Assistance

  36. Architecture Example format:

<parallel>
  <source>~~~~~~~</source>
  <target>~~~~~~~~</target>
</parallel>
<Alignment>
  <s2t>~~~~~~~~</s2t>
  <t2s>~~~~~~~~</t2s>
  <composition>~~~~</composition>
</Alignment>
<Additional>
  <url>~~~~~~</url>
</Additional>

(Diagram: a web extraction pipeline collects parallel sentence examples; an analysis/search engine over these examples is exposed through a query expression interface (function call) to the ESL dialog system and other applications.)
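A small sketch of consuming this example format with the Python standard library; the wrapper root element and the sample field values are placeholders, since the real values are elided above.

import xml.etree.ElementTree as ET

doc = """<example>
  <parallel>
    <source>회의실이 어디지?</source>
    <target>Where is the meeting room?</target>
  </parallel>
  <Alignment>
    <s2t>0-4 0-3</s2t>
    <t2s>3-0 4-0</t2s>
    <composition>1:1</composition>
  </Alignment>
  <Additional><url>http://example.com/page</url></Additional>
</example>"""

root = ET.fromstring(doc)
source = root.findtext("parallel/source")
target = root.findtext("parallel/target")
s2t = root.findtext("Alignment/s2t").split()  # "src-tgt" index pairs
print(source, "=>", target, s2t)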

  37. Building Bilingual Examples • Word alignment • Widely used in statistical machine translation • IBM Models 1-5, symmetrization heuristics • A word alignment represents the correspondence between the words/phrases of a bilingual example pair (Figure: an example word alignment produced by GIZA++; a toy sketch follows below.)
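As a toy illustration of what these alignment models compute (GIZA++ implements the full IBM model series), here is a compact EM loop for IBM Model 1 on a two-sentence corpus; t("the" | "das") comes to dominate because "the" co-occurs with "das" in both pairs.

from collections import defaultdict

corpus = [("the house".split(), "das Haus".split()),
          ("the book".split(), "das Buch".split())]

t = defaultdict(lambda: 0.25)  # t(e|f), uniform initialization

for _ in range(10):  # EM iterations
    count = defaultdict(float)
    total = defaultdict(float)
    for e_sent, f_sent in corpus:
        for e in e_sent:
            # E-step: expected alignment counts under the current t
            z = sum(t[(e, f)] for f in f_sent)
            for f in f_sent:
                c = t[(e, f)] / z
                count[(e, f)] += c
                total[f] += c
    for (e, f), c in count.items():
        t[(e, f)] = c / total[f]  # M-step: renormalize per foreign word

print(round(t[("the", "das")], 3))  # converges toward 1.0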

  38. 4. Comprehension Assistance

  39. INTRODUCTION • English Expression-Description Example Suggestion System • When the user encounters an unfamiliar English expression, the system presents its description to aid comprehension (Diagram: the dialog system sends a sentence to the description suggestion system, which detects expressions, looks them up in the expression-description DB built from an ESL podcast website, and recommends descriptions; a sketch follows below.)
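A hedged sketch of this suggestion flow: scan the sentence for known expressions and return their stored descriptions. The DB is a plain dict here and the entries are illustrative.

expression_db = {
    "on business": "for work-related reasons, not for pleasure",
    "get stuck": "to be unable to make progress",
}

def suggest_descriptions(sentence):
    # Expression detection: substring lookup against the
    # expression-description DB built from the ESL podcast site.
    lowered = sentence.lower()
    return {expr: desc for expr, desc in expression_db.items()
            if expr in lowered}

print(suggest_descriptions("I am here on business."))
# -> {'on business': 'for work-related reasons, not for pleasure'}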

  40. INTRODUCTION • Expression-Description Pair Extraction System • To present expression examples with their descriptions, the system extracts expression-description pairs from the ESL podcast site

  41. EXAMPLE (Screenshot: an ESL podcast script with its extracted description.)

  42. EXAMPLE (Screenshot: another script/description extraction example.)

  43. Language Learner Simulation

  44. 1. User Simulation

  45. INTRODUCTION • User Simulation for Spoken Dialog Systems • Developing a 'simulated user' who can replace real users • Applications • Automated evaluation of spoken dialog systems • Detecting potential flaws • Predicting overall system behavior • Learning dialog strategies in a reinforcement learning framework

  46. PROBLEM & GOAL • PROBLEM • How to model real user • User Intention simulation • User Surface simulation • ASR channel simulation • GOAL • Natural Simulation • Diverse Simulation • Controllable Simulation

  47. IDEA – User Intention Simulation • Dialog is sequential behavior, and user intention in particular • User intention simulation should therefore take various discourse information into account (Diagram: alternating User/Sys turns conditioned on discourse factors + knowledge + events.) Jung et al. (2009), Data-driven user simulation for automated evaluation of spoken dialog systems, Computer Speech and Language

  48. User Intention Simulation - Linear-Chain Conditional Random Field Model (Diagram: a linear-chain CRF over turns; each turn's user intention UI is conditioned on the discourse information DI.) • Assumption • A user utterance has only one intention • UI: User Intention State • State = [dialog_act, main_goal, named_entities] • DI: Previous Discourse Information • System response + discourse history (a sketch follows below) Jung et al. (2009), Data-driven user simulation for automated evaluation of spoken dialog systems, Computer Speech and Language
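A sketch of this formulation using sklearn-crfsuite as a stand-in for the paper's linear-chain CRF: each turn is a sequence element whose label is the user intention and whose features encode the previous discourse information. The toy corpus and feature names are assumptions for illustration.

import sklearn_crfsuite

training_dialogs = [  # toy annotated corpus (illustrative)
    [{"system_action": "greet", "filled_slots": [],
      "user_intention": "WH-QUESTION/SEARCH-LOC"},
     {"system_action": "inform(Floor)", "filled_slots": ["ROOM-TYPE"],
      "user_intention": "THANK"}],
]

def turn_features(dialog, i):
    turn = dialog[i]
    return {
        "sys_action": turn["system_action"],                # DI: last system response
        "history": ",".join(sorted(turn["filled_slots"])),  # DI: discourse history
        "turn_index": str(min(i, 5)),                       # coarse dialog position
    }

X = [[turn_features(d, i) for i in range(len(d))] for d in training_dialogs]
y = [[turn["user_intention"] for turn in d] for d in training_dialogs]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=100)
crf.fit(X, y)
print(crf.predict(X))  # simulated intention sequence per dialog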

  49. ALGORITHM Jung et al. (2009), Data-driven user simulation for automated evaluation of spoken dialog systems, Computer Speech and Language
