
Introduction to dialogue systems (part II)




  1. Goteborg University Dialogue Systems Lab Introduction to dialogue systems (part II) Staffan Larsson Dialogsystem HT04

  2. Goteborg University Dialogue Systems Lab Overview • Why Develop Speech Applications for the Telephone (Larson ch.1) • Dialogue and dialogue genres • Dialogue modeling and dialogue systems • Research areas & local projects • History of dialogue systems • Methodology for dialogue systems design • (Agents, dialogue and speech acts) • (Dialogue games)

  3. Goteborg University Dialogue Systems Lab Research areas & local projects

  4. Goteborg University Dialogue Systems Lab Problem areas & theories: formal pragmatics for dialogue systems • handling dialogue structure • dialogue games • speech acts • implicit information • presupposition • implicature • plan recognition • relating explicit & implicit information to context; updating the context • pronoun resolution (DRT, Centering Theory, abduction) • plan recognition • accommodation • choosing/planning utterances • planning • implicit information? • communication management • ICM, OCM • grounding • conversation analysis

  5. Goteborg University Dialogue Systems Lab Dialogue phenomena areas • Recognition • Interpretation • Dialogue management • incl. ”low-level” communication management • Generation • Synthesis • (Which ones are solved? Which ones are we working on? Which ones are others working on?)

  6. Goteborg University Dialogue Systems Lab Recognition • Use contextual information to improve recognition • Multiple simultaneous grammars • Reordering the N-best list based on the infostate (sketched below) • Combining statistical and grammar-based recognition • e.g. backing off to SLM • Automatic generation of recognition grammars • Recognition of unknown words • Utilizing speaker-dependent recognition when possible • Improving acoustical models and language models during dialogue • Outputting word by word
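
A minimal sketch of the N-best reordering idea, assuming the system has just asked a question and knows which answers it expects; the Hypothesis class, the scores and the context bonus are all invented for illustration:

    # Toy reranker: boost recognition hypotheses that answer the current question.
    from dataclasses import dataclass

    @dataclass
    class Hypothesis:
        text: str
        asr_score: float  # combined acoustic + language model score (assumed given)

    def rerank(nbest, expected_answers, context_bonus=0.3):
        """Sort hypotheses by ASR score plus a bonus for contextually expected answers."""
        def contextual_score(hyp):
            matches = any(ans in hyp.text.lower() for ans in expected_answers)
            return hyp.asr_score + (context_bonus if matches else 0.0)
        return sorted(nbest, key=contextual_score, reverse=True)

    # The system asked "Where do you want to go?" and expects a destination.
    nbest = [Hypothesis("design", 0.51), Hypothesis("to stockholm", 0.49)]
    print(rerank(nbest, {"stockholm", "gothenburg"})[0].text)  # "to stockholm"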

  7. Goteborg University Dialogue Systems Lab Interpretation • Context-independent • Robust parsing • Underspecified semantics • Computing presuppositions • Multilinguality • Context-dependent • Pronoun resolution • Ellipsis resolution (sketched below) • Deixis resolution • Indirect speech acts • Computing implicatures • Multimodality • Combining speech and gesture • Communication management • Understanding communication management moves (feedback, sequencing, turntaking)
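
As a small illustration of context-dependent interpretation, here is a sketch of resolving an elliptical answer against the open questions, in the spirit of issue-based interpretation; the question representation is invented, not taken from any actual system:

    # Toy ellipsis resolution: expand a fragmentary answer using the latest question.
    def resolve_fragment(fragment, questions_under_discussion):
        """Return a full proposition if some open question accepts the fragment."""
        for question in questions_under_discussion:  # most recent question first
            if fragment in question["expected_answers"]:
                return (question["predicate"], fragment)
        return None  # no open question accepts this fragment

    qud = [{"predicate": "destination", "expected_answers": {"paris", "london"}}]
    print(resolve_fragment("paris", qud))  # ('destination', 'paris')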

  8. Goteborg University Dialogue Systems Lab Dialogue management • Dealing with the user giving more or different information than requested (sketched below) • Information sharing between tasks • Multiple possible tasks • Multiple simultaneous tasks • Jumping back and forth between tasks • Discussing multiple alternative solutions to a problem • Arguing for and against alternatives
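
A minimal sketch of the first point, accepting over-informative answers: a frame-based manager fills every slot the user supplies, then asks about the next empty one. The frame and parser output formats are assumptions made for the example:

    # Toy frame update: accept more information than was asked for.
    def update_frame(frame, parsed_slots):
        """Fill every slot the user supplied, not just the one just asked about."""
        for slot, value in parsed_slots.items():
            if slot in frame and frame[slot] is None:
                frame[slot] = value

    def next_question(frame):
        """Ask about the first still-empty slot; None means the frame is complete."""
        for slot, value in frame.items():
            if value is None:
                return f"What {slot} would you like?"
        return None

    frame = {"destination": None, "date": None, "time": None}
    # The system asked only for the destination; the user also volunteered a date.
    update_frame(frame, {"destination": "Paris", "date": "May 28"})
    print(next_question(frame))  # "What time would you like?"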

  9. Goteborg University Dialogue Systems Lab Dialogue management (cont’d) • Asking and answering questions • Requesting actions and reporting on the status of ongoing actions • Planning joint activities • Interpreting ambiguous utterances • Linking utterances with the context • Asking clarification questions

  10. Goteborg University Dialogue Systems Lab Communication management • Selecting appropriate feedback moves (sketched below) • Dealing with misrecognition, misunderstanding, rejection • Dealing with the user’s feedback moves • Dealing with turntaking • Distinguishing positive feedback from interruptions • Selecting appropriate sequencing moves • Dealing with the user’s sequencing moves • Dealing with user self-corrections • The system correcting itself if necessary • E.g., when a better interpretation of the user’s previous utterance is received
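
A sketch of feedback move selection from recognition confidence, one common way to realize the first bullet; the thresholds and wordings are invented and would be tuned empirically in a real system:

    # Toy ICM selection: choose a feedback move based on recognition confidence.
    def select_feedback(heard, confidence):
        if confidence > 0.8:
            return f"OK, {heard}."                # implicit confirmation, carry on
        if confidence > 0.5:
            return f"Did you say {heard}?"        # explicit confirmation question
        return "Sorry, I didn't catch that."      # rejection, ask for a repeat

    print(select_feedback("to Paris", 0.65))  # "Did you say to Paris?"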

  11. Goteborg University Dialogue Systems Lab Generation • Deep (”what to say”) vs. surface generation (”how to say it”) • Context-independent • Keeping track of presuppositions • Context-dependent • Information structure • Generating ellipsis • Generating pronouns (sketched below) • Keeping track of implicatures • Multimodality • Combining speech and gesture
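
As an illustration of context-dependent generation, a sketch of pronoun generation driven by salience: use ”it” only when the referent is the single most salient entity. The salience list is a stand-in for a real discourse model:

    # Toy referring-expression choice: pronoun vs. full noun phrase.
    def refer_to(entity, salience):
        """salience lists discourse entities, most salient first (assumed given)."""
        if salience and salience[0] == entity:
            return "it"
        return f"the {entity}"

    salience = ["red block", "green pyramid"]
    print(refer_to("red block", salience))      # "it"
    print(refer_to("green pyramid", salience))  # "the green pyramid"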

  12. Goteborg University Dialogue Systems Lab Synthesis • Intonation • Prosody in general • Shouting, etc. • Emotional speech • Realistic voice quality • Knowing how much has been said • for turn-taking • Reasonable time consumption • caching (sketched below)
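
The caching point can be as simple as memoizing the synthesizer on prompt text, so repeated prompts cost nothing after the first synthesis. The synthesize function below is a dummy stand-in for a real TTS engine:

    # Toy TTS cache: synthesize each distinct prompt only once.
    from functools import lru_cache

    def synthesize(prompt: str) -> bytes:
        # Stand-in for the expensive call into a real speech synthesizer.
        return prompt.encode("utf-8")

    @lru_cache(maxsize=256)
    def synthesize_cached(prompt: str) -> bytes:
        return synthesize(prompt)

    audio = synthesize_cached("What time do you want to leave?")  # computed
    audio = synthesize_cached("What time do you want to leave?")  # from cache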

  13. Goteborg University Dialogue Systems Lab Overall • Dealing with underspecified information • Incremental interpretation and dialogue management • Speed (real-time) • Learning • User adaptation • Acoustical model • Offline informational model • Online informational user model

  14. Goteborg University Dialogue Systems Lab Dialogue systems research in Göteborg: themes • Early 90’s: Pragmatics-based Language Understanding (Allwood) • Late 90’s - present: Information state update approach (sketched below) • Dialogue moves • Abstract representations of utterances • Transitions between information states • Flexible dialogue • TrindiKit: a dialogue systems toolkit • Early 00’s - present: Issue-based dialogue management • A generic theory of dialogue, implemented using TrindiKit • Basic idea: dialogue is driven by explicit and implicit questions (issues) • PhD thesis: Larsson 2002 • Present (current!): Stream-based dialogue management (Lager)
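
To make the information state update idea concrete, here is a toy version of it: an information state holding a stack of questions under discussion and a set of shared commitments, with dialogue moves as transitions between states. This is a deliberately tiny sketch, not the actual TrindiKit or GoDiS machinery:

    # Toy information state: questions under discussion (QUD) + shared commitments.
    state = {"qud": [], "shared": {}}

    def update(state, move, content):
        """Update rules: each dialogue move is a transition between information states."""
        if move == "ask":
            state["qud"].insert(0, content)       # raising a question pushes it on QUD
        elif move == "answer" and state["qud"]:
            question = state["qud"].pop(0)        # an answer resolves the topmost question
            state["shared"][question] = content   # and becomes a shared commitment
        return state

    update(state, "ask", "destination?")
    update(state, "answer", "paris")
    print(state)  # {'qud': [], 'shared': {'destination?': 'paris'}}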

  15. Goteborg University Dialogue Systems Lab Swedish projects • SDS (Swedish Dialogue Systems, 98-00) • With Linköping, Telia and others. • ILT (Interactive Language Technology, 01-04) • Cooperation with Computer Science, Chalmers • Application: programming a computerized video recorder via telephone

  16. Goteborg University Dialogue Systems Lab EU projects • PLUS (Pragmatics-based Language Understanding, ca. 1991) • TRINDI (Task Oriented and Instructional Dialogue, 97-00) • Development of the information state approach • TrindiKit, GoDiS • SIRIDUS (Specification, Interaction and Reconfigurability in Dialogue Understanding Systems, 00-02) • Continuation of TRINDI; further development • D’Homme (Dialogue in the Home Environment, 01) • The intelligent home • TALK (Talk and Look, Tools for Ambient Intelligence, 03-06) • Extending the IS approach to multimodal and multilingual dialogue • Scenarios • in-car • in-home

  17. Goteborg University Dialogue Systems Lab A short history of dialogue systems

  18. Goteborg University Dialogue Systems Lab The Turing test • Can a machine be intelligent? Is ”artificial intelligence” (AI) possible? • Turing offers an operational definition of intelligence • Turing (1912-1954): ”the Turing test” • Test person A has a dialogue (via a text terminal) with B. • A’s goal is to decide whether B is a human or a machine • If B is a machine and manages to convince A that it is human, B should be regarded as intelligent (able to think; ”a grade A machine”) • (This is a simplified version of the Turing test)

  19. Goteborg University Dialogue Systems Lab The Turing test and dialogue • According to the Turing test – what is fundamentally human? • The ability to carry out a dialogue using natural language • Why is this fundamental? • Assumption: In dialogue, all other human capabilities show themselves (directly or indirectly) • This means that ... • ... in order to make a computer use natural language in the same way and on the same level as a human, it needs to be endowed with human-level intelligence

  20. Goteborg University Dialogue Systems Lab Artificial Intelligence • Goal • simulate human/intelligent behaviour/thinking • Weak AI • Machines can be made to act as if they were intelligent • Strong AI • Agents that act intelligently have real, conscious minds • It is possible to believe in strong AI but not in weak AI

  21. Goteborg University Dialogue Systems Lab Cognitivism and GOFAI • Descartes: • Understanding and thinking consist in forming and using symbolic representations • Until the mid-80’s, the paradigm of AI was cognitivism, the idea that thinking is, essentially, symbol manipulation • The physical symbol system hypothesis (Newell & Simon): • ”A physical symbol system has the necessary and sufficient means for intelligent action.” • All intelligent behaviour can be captured by a system that reasons logically from a set of facts and rules describing the domain • This is sometimes referred to as GOFAI (Good Old Fashioned AI)

  22. Goteborg University Dialogue Systems Lab Dialogue systems as GOFAI? • Since around 1986, GOFAI has been abandoned by many AI researchers • Instead, the focus is on connectionism, embodied interactive automata, reinforcement learning, probabilistic methods, etc. • However, a large part of current dialogue systems research adheres to the GOFAI paradigm • Information states, for example… • Why? • It seems to be the most workable method for the complex problems of natural language dialogue • It appears to be useful for improving on current human-computer interfaces, although a major breakthrough of NL interfaces is needed to prove this conclusively • But is it also a step on the way towards ”human-level” natural language understanding in computers? • Does it scale up?

  23. Goteborg University Dialogue Systems Lab AI paradigms • Cognitivism / GOFAI • physical symbol system hypothesis • GPS (Newell & Simon) • Still the dominant type of system • ”Emergent AI” • Connectionism • Reinforcement learning • Interactive AI; embodied automata (Brooks, ...) • Society of Mind (Minsky)

  24. Goteborg University Dialogue Systems Lab Definition of ”dialogue system”? • System able to participate in producing coherent sequences of natural language utterances (taking ”coherent” with a grain of salt) • Not ”one-shot” systems • where previous dialogue cannot affect interpretation of current utterance • e.g. most NL database interfaces • even Eliza can return to previous topics!

  25. Goteborg University Dialogue Systems Lab ”Classic” systems 1966-1972 • ELIZA (Weizenbaum 1966) • what we would today call a chatterbot; the same goes for PARRY • PARRY (Colby 1973) • SHRDLU (Winograd 1972) • Perhaps the first real dialogue system

  26. Goteborg University Dialogue Systems Lab SHRDLU • ”Blocks world”

  27. Goteborg University Dialogue Systems Lab • Person: PICK UP A BIG RED BLOCK. • Computer: OK. (does it) • Person: GRASP THE PYRAMID. • Computer: I DON'T UNDERSTAND WHICH PYRAMID YOU MEAN. • Person: FIND A BLOCK WHICH IS TALLER THAN THE ONE YOU ARE HOLDING AND PUT IT INTO THE BOX. • Computer: BY "IT", I ASSUME YOU MEAN THE BLOCK WHICH IS TALLER THAN THE ONE I AM HOLDING. • Computer: OK. (does it) • Person: WHAT DOES THE BOX CONTAIN? • Computer: THE BLUE PYRAMID AND THE BLUE BLOCK. • Person: WHAT IS THE PYRAMID SUPPORTED BY? • Computer: THE BOX. • Person: HOW MANY BLOCKS ARE NOT IN THE BOX? • Computer: FOUR OF THEM. • Person: IS AT LEAST ONE OF THEM NARROWER THAN THE ONE WHICH I TOLD YOU TO PICK UP? • Computer: YES, THE RED CUBE. • Person: IS IT SUPPORTED? • Computer: YES, BY THE TABLE. • Person: CAN THE TABLE PICK UP BLOCKS? • Computer: NO. • Person: CAN A PYRAMID BE SUPPORTED BY A BLOCK? • Computer: YES. • Person: CAN A PYRAMID SUPPORT A PYRAMID? • Computer: I DON'T KNOW. • Person: STACK UP TWO PYRAMIDS. • Computer: I CAN'T. • Person: WILL YOU PLEASE STACK UP BOTH OF THE RED BLOCKS AND EITHER A GREEN CUBE OR A PYRAMID? • Computer: OK. (does it) • Person: WHICH CUBE IS SITTING ON THE TABLE? • Computer: THE LARGE GREEN ONE WHICH SUPPORTS THE RED PYRAMID. • Person: IS THERE A LARGE BLOCK BEHIND A PYRAMID? • Computer: YES, THREE OF THEM: A LARGE RED ONE, A LARGE GREEN CUBE, AND THE BLUE ONE. • Person: PUT A SMALL ONE ONTO THE GREEN CUBE WHICH SUPPORTS A PYRAMID. • Computer: OK. (does it) • Person: PUT THE LITTLEST PYRAMID ON TOP OF IT. • Computer: OK. (does it)

  28. Goteborg University Dialogue Systems Lab • Person: DOES THE SHORTEST THING THE TALLEST PYRAMID'S SUPPORT SUPPORTS SUPPORT ANYTHING GREEN? • Computer: YES, THE GREEN PYRAMID. • Person: WHAT COLOR IS IT? • Computer: BY "IT", I ASSUME YOU MEAN THE SHORTEST THING THE TALLEST PYRAMID'S SUPPORT SUPPORTS. • Computer: RED. • Person: HOW MANY THINGS ARE ON TOP OF GREEN CUBES? • Computer: I'M NOT SURE WHAT YOU MEAN BY "ON TOP OF" IN THE PHRASE "ON TOP OF GREEN CUBES". •  DO YOU MEAN: • 1 - DIRECTLY ON THE SURFACE • 2 - ANYWHERE ON TOP OF? • Person: 2 • Computer: THREE OF THEM. • Person: HAD YOU TOUCHED ANY PYRAMID BEFORE YOU PUT THE GREEN ONE ON THE LITTLE CUBE? • Computer: YES, THE GREEN ONE. • Person: WHEN DID YOU PICK IT UP? • Computer: WHILE I WAS STACKING UP THE RED CUBE, A LARGE RED BLOCK AND A LARGE GREEN CUBE. • Person: WHY? • Computer: TO GET RID OF IT. • Person: WHY DID YOU DO THAT? • Computer: TO CLEAN OFF THE RED CUBE. • Person: WHY DID YOU CLEAR OFF THAT CUBE? • Computer: TO PUT IT ON A LARGE GREEN CUBE. • Person: WHY DID YOU DO THAT? • Computer: TO STACK UP THE RED CUBE, A LARGE RED BLOCK AND A LARGE GREEN CUBE. • Person: WHY DID YOU DO THAT? • Computer: BECAUSE YOU ASKED ME TO. • Person: HOW DID YOU DO IT? • Computer: BY PUTTING A LARGE RED BLOCK ON THE TABLE; THEN LETTING GO OF IT; THEN PUTTING A LARGE GREEN CUBE ON IT; THEN LETTING GO OF THAT CUBE; THEN PUTTING THE RED CUBE ON THAT CUBE; THEN LETTING GO OF THAT CUBE. • Person: HOW MANY OBJECTS DID YOU TOUCH WHILE YOU WERE DOING IT? • Computer: FOUR OF THEM.

  29. Goteborg University Dialogue Systems Lab • Person: WHAT DID THE RED CUBE SUPPORT BEFORE YOU STARTED TO CLEAN IT OFF? • Computer: THE GREEN PYRAMID. • Person: THERE WERE FIVE BLOCKS TO THE LEFT OF THE BOX THEN. • Computer: NO, ONLY FOUR OF THEM: THE RED CUBE, TWO LARGE GREEN CUBES AND A LARGE RED BLOCK. • Person: PUT THE BLUE PYRAMID ON THE BLOCK IN THE BOX. • Computer: OK. (does it) • Person: IS THERE ANYTHING WHICH IS BIGGER THAN EVERY PYRAMID BUT IS NOT AS WIDE AS THE THING THAT SUPPORTS IT? • Computer: YES, THE BLUE BLOCK. • Person: DOES A STEEPLE • Computer interrupting: SORRY, I DON'T KNOW THE WORD "STEEPLE". • Person: A "STEEPLE" IS A STACK WHICH CONTAINS TWO GREEN CUBES AND A PYRAMID. • Computer: I UNDERSTAND. • Person: ARE THERE ANY STEEPLES NOW? • Computer: NO. • Person: BUILD ONE. • Computer: OK. (does it) • Person: CALL THE BIGGEST BLOCK "SUPERBLOCK". • Computer: OK. • Person: HAVE YOU PICKED SUPERBLOCK UP SINCE WE BEGAN? • Computer: YES. • Person: WHY DID YOU DROP IT? • Computer: BY "IT", I ASSUME YOU MEAN THE SUPERBLOCK. •  I CAN'T EXPLAIN A NON-EXISTENT EVENT. • Person: IS THERE ANYTHING TO THE RIGHT OF THE RED PYRAMID? • Computer: YES, FOUR THINGS: THE BOX , THE RED CUBE, THE BLUE PYRAMID AND THE BLOCK NAMED SUPERBLOCK • Person: THANK YOU. • Computer: YOU'RE WELCOME!

  30. Goteborg University Dialogue Systems Lab Dialogue systems 1978-1986 • GUS (Bobrow 1978) - travel agent, collaborative planning • frame tree, sentence fragments • OSCAR (Cohen 1978) • speech act planning • MICS (Carbonell 1979/1981) - political and social interaction(!) • ARGOT (Allen 1982) - computer operator • dialog grammar, plan recognition • HAM-ANS (Hoppner et al 1983) - portable interface • XCALIBUR (Carbonell et al 1983) – expert system interface • case frames, ellipsis resolution, error recovery • Fred/Doris (Houghton 1986 thesis) • dialogue games (Power), speech acts, and planning • agents in microworld

  31. Goteborg University Dialogue Systems Lab GUS • GUS: the Genial Understander System • Developed in the mid-1970s at Xerox PARC • Aims to provide a cooperative dialogue • Restricted domain of discourse: a travel agent with the goal of booking a return trip • ‘Frame-based’: has expectations about what is required to complete the call; today we’d call this ‘information-state based’ • Simulated speech input and output via a text interface

  32. Goteborg University Dialogue Systems Lab A Sample Dialogue GUS: Hello. My name is GUS. I can help you plan a simple trip by air. Where do you want to go? Client: I want to go to San Diego on May 28. GUS: What time do you want to leave? Client: I must be in San Diego before 10am. GUS: Would you like PSA flight 102 that arrives at 915am? Client: What is the next flight? GUS: Air California flight 310 that leaves at 830am and arrives at 1000am. Client: I’ll take the first one. GUS: In what name should I make a reservation, please? Client: Dan Bobrow. GUS: I have confirmed the following flight: PSA flight 102 on Wednesday May 28 from San Jose to San Diego that leaves at 730am and arrives at 915am. What date do you want to return on? Client: On Friday in the evening. GUS: Would you like the flight that leaves at 745pm? Client: That’s fine. GUS: I have confirmed the following flight: PSA flight 307 on Friday May 30 from San Diego to San Jose that leaves at 745pm and arrives at 930pm. Thank you for calling. Goodbye.
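
A sketch of the frame-driven control that produces dialogues like the one above: a trip frame whose first unfilled slot determines the next question, with nested frames for the outbound and return legs. This is reconstructed from the published description of GUS, not from Bobrow's code, and all names are illustrative:

    # Toy GUS-style frame tree: the first empty slot (depth first) drives the dialogue.
    trip_frame = {
        "outbound": {"from": "San Jose", "to": None, "date": None, "flight": None},
        "return": {"date": None, "flight": None},
        "traveller": None,
    }

    prompts = {"to": "Where do you want to go?",
               "date": "On what date?",
               "flight": "Which flight would you like?",
               "traveller": "In what name should I make a reservation?"}

    def next_prompt(frame):
        """Return the prompt for the first unfilled slot, descending into subframes."""
        for slot, value in frame.items():
            if isinstance(value, dict):
                prompt = next_prompt(value)
                if prompt:
                    return prompt
            elif value is None:
                return prompts[slot]
        return None  # frame complete: confirm and close the call

    print(next_prompt(trip_frame))  # "Where do you want to go?"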

  33. Goteborg University Dialogue Systems Lab 1988-1992 • Unix Consultant (Wilensky et al 1988) • predict user expertise level • IREPS (Carberry 1988) - information seeking • plan structure, predict user goals • MINDS (Young et al 1989) – train timetable database • goal trees, hand-coded domain knowledge • VODIS (1989) - train timetable database • ”object frames”, speech • ESTEAM-316 (Jullien & Marty 1989) – financial advisor • dialogue plans, plan recognition • SunDial (Peckham 1991) • task-structured history, belief model, speech • Circuit Fix-it (Smith & Hipp 1992) • ”missing axiom theory”, speech • TRAINS (Allen 1993) • joint planning • Trains project: http://www.cs.rochester.edu/research/trains/ • (HearSay?)

  34. Goteborg University Dialogue Systems Lab Other ”modern” systems (still active) (see also http://www-2.cs.cmu.edu/~dbohus/SDS/ - 47 systems) • TRIPS (Allen) • Galaxy Communicator • Collagen (Sidner) • RavenClaw • WildFire • VerbMobil • SmartKom • ARISE • AutoTutor • WITAS (Lemon) • CONVERSE • EDIS (Traum 1998) • MIDAS (Bos 1998) • GoDiS (Larsson 2002) • Beetle (Zinn, Moore) • Mission Rehearsal Exercise (Traum) • Frameworks: TrindiKit, DIPPER, DARPA Communicator

  35. Goteborg University Dialogue Systems Lab Not quite dialogue systems • Database query systems • Text understanding systems • Chatterbots

  36. Goteborg University Dialogue Systems Lab Database query systems (http://www-personal.umich.edu/~abney/ling492/systems.html + Smith & Hipp; also read http://www-personal.umich.edu/~abney/ling492/1961.pdf) • Ask (Thompson & Thompson 1983) • Baseball (Green et al 1961, 1963) • Chat-80 (Warren & Pereira 1982) • Co-op (Kaplan 1982) - two domains • detection of invalid presuppositions • Core Language Engine (Alshawi et al 1992) • Datalog (Hafner & Godden 1985) – multi-domain • DIALOG (Bolc et al 1985) – medical database • Intellect (Harris 1984) • Janus • Ladder - SRI (Hendrix et al 1978) • naval info • LanguageAccess (IBM) (Ott 1992) • Loqui • Lotus HAL • Lunar (Woods 1973, 1978) • PHLIQA1 • Planes - Waltz • Pragma (Levine 1990) • user goal recognition and prediction • PSLI3 (Frederking 1988) • medical database • Q&A (Symantec) • Rendezvous (Codd 1974) • Rus, Irus, Parlance (BBN) • ROBOT (Harris 1977) • TEAM (Diagram, Dialogic) (Grosz et al 1987) • multi-domain • TINA (Seneff 1992) • 2 domains

  37. Goteborg University Dialogue Systems Lab Text understanding systems • SAM (Schank & Abelson) • an attempt at formalising the everyday background knowledge needed for interpreting simple stories • Discourse system (Allen et al 1989) • advisory dialogue interpretation

  38. Goteborg University Dialogue Systems Lab Chatterbots (http://www.simonlaven.com) • Shampage, a brilliant chatterbot program whose language base is totally configurable. You can set it up any way you want, and if you set it up well enough it can become a truly amazing program. • Eliza, the virtual psychoanalyst. Originally created by MIT scientist Joseph Weizenbaum. Several versions are available here, along with background information on the world’s most famous chatterbot. • Fred, the Functional Response Emulation Device. The first program in an ongoing experiment to explore natural language communication between people and computer programs. • Claude, when the author released this program to the world, the 'readme.doc' said: "Claude isn't very smart compared to you". This is true, but he's also a clone of the classic Racter chatterbot.

  39. Goteborg University Dialogue Systems Lab Swedish ”dialogue systems” / voice-controlled services reachable by telephone • Bilregistret, the vehicle registry (SpeechCraft): 077-114 15 16 • SAS SpeechLine (SpeechCraft): 0770-727 888 • SJ’s automatic train timetable service (Presector): 0771-75 75 75, press 1 • Sjöfartsverket’s marine weather service (SpeechCraft): 08-612 54 40 • Storstockholms Lokaltrafik’s travel information (SpeechCraft): 08-600 10 00 • Telia’s directory assistance Autosvar (Presector): 118 888 • teliamobile’s voice-controlled stock service (SpeechCraft): 4444 (teliamobile customers only) • Västernorrlands länstrafik / Din Tur (SpeechCraft): 0771-511 513, then press 1 • Västtrafik’s voice-controlled timetable service (Sigma): 0771-41 43 00, press 1

  40. Goteborg University Dialogue Systems Lab Methodology for dialogue system design

  41. Goteborg University Dialogue Systems Lab General R&D cycle 1. decide on initial framework, system, domain 2. domain activity & communication analysis • corpus collection and analysis, or another starting point (e.g. an existing menu-based system) 3. (re)design of framework, system and/or application 4. (re)implementation of the above 5. user testing 6. go to 2 until satisfied or out of funds

  42. Goteborg University Dialogue Systems Lab Corpus collection and analysis • Collection • Natural dialogue • Wizard-of-Oz (WoZ) • Backseat Driver (BaD?) • Analysis • Transcription • Distillation • Coding

  43. Goteborg University Dialogue Systems Lab Natural dialogue • Illustrates real goals & needs • The person playing the system’s role does not behave as the system would have -> more complex dialogue • Not clear that users have the same expectations of a system • Is different language used towards a human than towards a computer?

  44. Goteborg University Dialogue Systems Lab Wizard of Oz • The user believes he is interacting with a computer, but it is really a human imitating one. • The user (A) is given a task to carry out (a scenario) • The wizard (W) has access to a background system, e.g. a database; W communicates via speech synthesis or text, possibly through a special simulation tool • Useful? • WOZ measures preconceptions; people will adapt to how systems actually behave • human-computer dialogues deal with simpler domains -> simpler dialogues; why restrict the system to simple dialogues in advance? • Ecological validity • hard to imitate a system realistically; humans misspell, type slowly, etc.; requires a lot of work • ”role play” - not realistic behaviour from the user • People behave differently towards a machine than towards a human; e.g. no need for politeness; perhaps no indirect speech acts • Or is politeness & indirectness habitual & unconscious? • Unethical? Works only as long as the method & the state of the art are unknown to the general public.

  45. Goteborg University Dialogue Systems Lab Distillation • rewriting human-human dialogues so that they resemble human-computer dialogues • guidelines, a sketch of the system’s behaviour • used in a realistic situation • Problems: • labour-intensive • subjective; the result depends partly on the distiller • hard to keep the dialogue coherent

  46. Goteborg University Dialogue Systems Lab Annotation and analysis • Dialogue move / speech act annotation • May require domain analysis • Coding schemas - reliability - kappa (sketched below) • IS (information state) coding • HCI analysis of the task?
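
Since the slide mentions kappa for coding reliability, here is a worked sketch of Cohen's kappa over two annotators' dialogue move labels; the move labels are made up:

    # Cohen's kappa: agreement between two annotators, corrected for chance.
    from collections import Counter

    def cohen_kappa(labels_a, labels_b):
        n = len(labels_a)
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        counts_a, counts_b = Counter(labels_a), Counter(labels_b)
        # Chance agreement: both annotators independently pick the same label.
        expected = sum(counts_a[l] * counts_b[l] for l in counts_a) / n ** 2
        return (observed - expected) / (1 - expected)

    a = ["ask", "answer", "ask", "greet", "answer"]
    b = ["ask", "answer", "answer", "greet", "answer"]
    print(round(cohen_kappa(a, b), 2))  # 0.69 (observed 0.80, chance 0.36)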

  47. Goteborg University Dialogue Systems Lab Evaluation of dialogue systems • ”Objective” criteria • (e.g. the PARADISE framework; sketched below) • time taken to complete the task • number of system/user turns • proportion of corrections • transaction success • Subjective • interviews, questionnaires • user-friendliness, naturalness, clarity, friendliness, robustness • Dialogue capabilities (e.g. the TRINDI ticklist)
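
A sketch of computing the ”objective” measures above from a logged dialogue; the log format is invented for the example, and real evaluations (e.g. PARADISE) combine such measures with user satisfaction scores:

    # Toy objective evaluation over a dialogue log.
    def evaluate(log):
        turns = log["turns"]
        user_turns = [t for t in turns if t["speaker"] == "user"]
        corrections = [t for t in user_turns if t.get("is_correction")]
        return {
            "task_time_s": turns[-1]["time"] - turns[0]["time"],
            "n_turns": len(turns),
            "correction_rate": len(corrections) / max(len(user_turns), 1),
            "task_success": log["task_completed"],
        }

    log = {"task_completed": True,
           "turns": [{"speaker": "system", "time": 0.0},
                     {"speaker": "user", "time": 4.2},
                     {"speaker": "system", "time": 7.0},
                     {"speaker": "user", "time": 11.5, "is_correction": True}]}
    print(evaluate(log))  # task_time_s: 11.5, n_turns: 4, correction_rate: 0.5, ...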

  48. Goteborg University Dialogue Systems Lab Agents (Wooldridge & Jennings)

  49. Goteborg University Dialogue Systems Lab What is an (artificial) agent? • a behaviour-based definition • autonomy: • agents act without direct intervention by humans or others, and have control over their own actions and their own internal state • social ability: • agents interact with other agents (incl. humans), among other things by means of language • reactivity: • agents perceive their environment (the physical world, a graphical user interface, the internet...) and react to changes in it • proactivity: • agents do not merely react to their environment, but are also capable of goal-directed behaviour and can take the initiative

  50. Goteborg University Dialogue Systems Lab Two main types of frameworks for artificial agents • ”Deliberative” • the agent has an explicitly represented symbolic model of the world • decisions are made by logical inference (pattern matching, symbol manipulation) • theory-based • Example: General Problem Solver (Newell & Simon) • Reactive • no symbolic model • no complex symbol processing • Example: situated finite automata (Rosenschein & Kaelbling) • tend to be ad hoc • there are also hybrid theories • a reactive and a deliberative layer • Are humans reactive or deliberative? Or perhaps hybrids... (a toy contrast is sketched below)
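
A toy contrast of the two framework types, under obvious simplifications: the reactive agent is a bare condition-action table, while the deliberative agent searches a symbolic model of the world for a plan. Both examples are invented:

    # Reactive: direct condition-action rules, no world model.
    def reactive_agent(percept):
        rules = {"obstacle": "turn", "goal_visible": "approach"}
        return rules.get(percept, "wander")

    # Deliberative: plan by breadth-first search over a symbolic state space.
    from collections import deque

    def deliberative_agent(state, goal, actions):
        frontier, seen = deque([(state, [])]), {state}
        while frontier:
            current, plan = frontier.popleft()
            if current == goal:
                return plan
            for name, effect in actions.items():
                nxt = effect(current)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, plan + [name]))
        return None  # no plan found

    actions = {"step": lambda pos: pos + 1, "back": lambda pos: pos - 1}
    print(reactive_agent("obstacle"))         # "turn"
    print(deliberative_agent(0, 3, actions))  # ['step', 'step', 'step']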
