
Information Retrieval Interaction



Presentation Transcript


  1. Information Retrieval Interaction CMSC 838S Douglas W. Oard April 27, 2006

  2. Process/System Co-Design

  3. IR System
  • Process: Source Selection → Query Formulation → Query → Search → Ranked List → Document Selection → Document Examination → Document Delivery
  • System Design: Collection Acquisition → Collection → Collection Indexing → Index (the Search step matches the Query against the Index)
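The pipeline on this slide can be sketched in code. The toy Python below is my illustration, not from the lecture: it covers collection indexing, search over the index, and the ranked list handed to the selection step. `index_collection`, `search`, and the three-document collection are all invented for the example.

```python
import math
from collections import Counter

def index_collection(docs):
    """Collection indexing: build an inverted index of term -> {doc_id: tf}."""
    index = {}
    for doc_id, text in docs.items():
        for term, tf in Counter(text.lower().split()).items():
            index.setdefault(term, {})[doc_id] = tf
    return index

def search(query, index, n_docs):
    """Search: score each document against the query (a simple TF-IDF
    dot product) and return the ranked list shown in the selection step."""
    scores = Counter()
    for term in query.lower().split():
        postings = index.get(term, {})
        if not postings:
            continue
        idf = math.log(n_docs / len(postings))
        for doc_id, tf in postings.items():
            scores[doc_id] += tf * idf
    return scores.most_common()  # [(doc_id, score), ...] best first

# Collection acquisition: a toy three-document collection
docs = {
    "d1": "information retrieval systems rank documents",
    "d2": "interaction design for retrieval interfaces",
    "d3": "cooking with seasonal vegetables",
}
index = index_collection(docs)
print(search("retrieval interaction", index, len(docs)))
```

Only d1 and d2 mention a query term, and d2 ranks first because "interaction" is the rarer (higher-IDF) term.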

  4. Information Needs (Taylor)
  • Visceral: what you really want to know
  • Conscious: what you recognize that you want to know
  • Formalized: how you articulate what you want to know
  • Compromised: how you express what you want to know to a system

  5. Anomalous State of Knowledge
  • Belkin: searchers do not clearly understand
    • The problem itself
    • What information is needed to solve the problem
  • The query results from a clarification process
  • Dervin’s “sense making”: Need → Gap → Bridge

  6. Selection/Examination Tasks
  • “Indicative” tasks
    • Recognizing what you are looking for
    • Determining that no answer exists in a source
    • Probing to refine mental models of system operation
  • “Informative” tasks
    • Vocabulary acquisition
    • Concept learning
    • Information use

  7. A Selection Interface Taxonomy
  • One-dimensional lists
    • Content: title, source, date, summary, ratings, ...
    • Order: retrieval status value, date, alphabetic, ...
    • Size: scrolling, specified number, score threshold
  • Two-dimensional displays
    • Construction: clustering, starfield, projection
    • Navigation: jump, pan, zoom
  • Three-dimensional displays
    • Contour maps, fishtank VR, immersive VR

  8. Google: KeyWord In Context (KWIC)
  Query: University of Maryland College Park
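A keyword-in-context display can be sketched as follows. This is an illustrative Python snippet, not Google's implementation; `kwic`, the window size, and the sample document are assumptions for the example.

```python
def kwic(text, query_terms, window=4):
    """Return keyword-in-context snippets: each query-term hit shown with
    a few words of surrounding context, as a result list might display it."""
    words = text.split()
    terms = {t.lower() for t in query_terms}
    snippets = []
    for i, w in enumerate(words):
        if w.lower().strip(".,") in terms:
            lo, hi = max(0, i - window), i + window + 1
            snippets.append("... " + " ".join(words[lo:hi]) + " ...")
    return snippets

# Hypothetical result document for the slide's example query
doc = ("The University of Maryland campus at College Park hosts "
       "the iSchool and the Department of Computer Science.")
for s in kwic(doc, ["Maryland", "Park"], window=3):
    print(s)
```

Each hit produces one snippet, so a term appearing twice would yield two context windows; a real interface would merge overlapping windows.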

  9. U Mass: Scrollbar-Tilebar

  10. NPR Online

  11. Ask: Suggested Query Refinements

  12. Vivisimo: Clustered Results
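As a hedged illustration of result clustering (not Vivisimo's actual algorithm), the sketch below groups result titles by shared content words and keeps only labels covering more than one result; `cluster_results`, the stopword list, and the toy titles are all invented for the example.

```python
from collections import defaultdict

# Tiny illustrative stopword list (a real system would use a fuller one)
STOPWORDS = {"the", "a", "of", "for", "and", "in", "to"}

def cluster_results(titles):
    """Group result titles by shared content words: a heavily simplified
    stand-in for on-the-fly result clustering."""
    clusters = defaultdict(list)
    for title in titles:
        for word in title.lower().split():
            if word not in STOPWORDS:
                clusters[word].append(title)
    # Keep only cluster labels shared by at least two results
    return {label: docs for label, docs in clusters.items() if len(docs) > 1}

titles = [
    "jaguar car reviews",
    "jaguar cat habitat",
    "habitat for big cats",
]
print(cluster_results(titles))
```

On these titles the surviving labels are "jaguar" and "habitat", which is the point of clustered displays: they separate senses ("car" vs. "cat") that a flat ranked list interleaves.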

  13. Kartoo’s Cluster Visualization

  14. Semantic Maps
  • PNL Themescape/Themeview (1995), http://www.pnl.gov/infoviz/technologies.html
  • Sandia Vx-Insight (1996), http://www.cs.sandia.gov/projects/VxInsight.html
  • Arizona ET-MAP (1998), http://ai.eller.arizona.edu/research/dl/etmapdemo.htm
  • Some challenges:
    • Region labeling
    • Point labeling

  15. Pacific Northwest Lab: ThemeView Credit to: Pacific Northwest National Laboratory

  16. LTU ImageSeeker

  17. Glasgow: “Ostensive” Browser

  18. CMU: Television News Retrieval

  19. Maryland: Baltimore Learning Community

  20. Ben’s ‘Seamless Interface’ Principles
  • Informative feedback
  • Easy reversal
  • User in control
    • Anticipatable outcomes
    • Explainable results
  • Browsable content
  • Limited working memory load
    • Query context
    • Path suspension
  • Alternatives for novices and experts
    • Scaffolding

  21. My ‘Synergistic Interaction’ Principles
  • Interdependence with process (“interaction models”)
    • Co-design with search strategy
  • Speed
  • System initiative
    • Guided process
  • Exposing the structure of knowledge
    • Support for reasoning
    • Representation of uncertainty
    • Meaningful dimensions
  • Synergy with features used for search
    • Weakness of similarity, strength of language
  • Easily learned
    • Familiar metaphors (timelines, ranked lists, maps)

  22. Guidelines for Practice
  • Show the query in the selection interface
    • It provides context for the display
  • Explain what the system has done
    • It is hard to control a tool you don’t understand
    • Highlight search terms, for example
  • Complement what the system has done
    • Users add value by doing things the system can’t
  • Expose the information users need to judge utility
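The "highlight search terms" guideline can be illustrated with a small sketch (mine, not from the slides); `highlight` and the `**bold**` markers are invented for the example, and the regex word boundaries keep partial-word matches from being marked.

```python
import re

def highlight(snippet, query):
    """Mark query terms in a result snippet so users can see why the
    system retrieved it (here with **bold** markers around each hit)."""
    for term in query.split():
        # \b...\b matches whole words only; \1 preserves original casing
        snippet = re.sub(r"\b(%s)\b" % re.escape(term), r"**\1**",
                         snippet, flags=re.IGNORECASE)
    return snippet

print(highlight("Retrieval systems rank documents by estimated relevance.",
                "retrieval relevance"))
```

This explains part of what the system has done (these words matched) without claiming more than it knows, which is exactly the transparency the slide asks for.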
